The U.S. Should Adopt the 'Right to Be Forgotten' Online - Open to Debate
March 11, 2015

In 2014, the European Union’s Court of Justice determined that individuals have a right to be forgotten, “the right—under certain conditions—to ask search engines to remove links with personal information about them.” It is not absolute, but meant to be balanced against other fundamental rights, like freedom of expression. In the half year following the Court’s decision, Google received over 180,000 removal requests. Of those reviewed and processed, 40.5% were granted. While the ruling was largely seen as a victory in Europe, the reaction in the U.S. has been overwhelmingly negative. Was this ruling a blow to free speech and public information, or a win for privacy and human dignity?

08:00 PM Wednesday, March 11, 2015


  • 00:00:00

    John Donvan:
    When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the “right to be forgotten.” And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online,” a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.”

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz.

    [applause]

    John Donvan:
    Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC.

    [laughter]

    John Donvan:
    And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, “You have no right to see me naked,” by which you meant what?

    Paul Nemitz:
    I meant that there are limits to snooping, collecting and making my private life public on Google.

    John Donvan:
    And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument.

    Paul Nemitz:
    The eminent Professor Eric Posner from the University of Chicago Law School.

    John Donvan:
    Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause]
    Yes, Eric, you are a law professor. You’ve written the book, most recently, “The Twilight of International Human Rights Law.” And back in the U.S., when Europe passed its “Right to Be Forgotten” law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now?

    Eric Posner:
    No, they haven’t, but I’m hoping they’ll change their minds soon enough.

    John Donvan:
    Especially after tonight.

    Eric Posner:
    Especially after tonight.

    John Donvan:
    Ladies and gentlemen, the team arguing for the motion that “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.”

    [applause]

    John Donvan:
    And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin.

    [applause]

    John Donvan:
    Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a “travesty.” You’re no longer at Google, but if you were, would you be using that kind of language?

    Andrew McLaughlin:
    [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty.

    John Donvan:
    Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew?

    Andrew McLaughlin:
    The equally eminent Professor Jonathan Zittrain.

    John Donvan:
    Ladies and gentlemen, Jonathan Zittrain.

    [applause]

    John Donvan:
    The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s “Right to Be Forgotten,” and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain:
    Well, we could end early and just hit the bar, but —

    [laughter]

    Jonathan Zittrain:
    — I’ve described the “Right to Be Forgotten” as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes.

    John Donvan:
    All right. We’re very interested to see where you go with that argument. Again, the motion is, “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online,” in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.”

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process,
    and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz.

    [applause]

    Paul Nemitz:
    Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the “Right to Be Forgotten,” the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning were very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, “I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children,” then you should vote in favor of the motion: “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much.

    John Donvan:
    Thank you, Paul Nemitz.

    [applause]

    John Donvan:
    And the motion is that “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin.

    [applause]

    Andrew McLaughlin:
    Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called “right to be forgotten” in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the “right to be forgotten,” is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the “right to be forgotten” is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The “right to be forgotten” does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the “right to be forgotten.”

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the “right to be forgotten” is censorship. Note the odd passive voice in the construction of the right: “the right to be forgotten.” We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The “right to be forgotten” is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the “right to be forgotten” as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The “right to be forgotten” does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the “right to be forgotten” undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the “right to be forgotten” and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat, Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, “He who controls the past controls the future.” Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.

    John Donvan:
    Thank you, Andrew McLaughlin.

    [applause]

    John Donvan:
    And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause]

    Eric Posner:
    I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called “privacy.” And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed. She has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called “expungement statutes” that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information. It could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, “This is a terrible state of affairs that we have,” because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made such a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: “What a ridiculous proposal, what a tremendous invasion of people’s privacy.” But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the “Right to Be Forgotten” would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the “Right to Be Forgotten” Online in the United States. Thank you very much.

  • 00:27:57

    [applause]

    John Donvan:
    Thank you, Eric Posner. And that is the motion, “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy —

    [applause]

    John Donvan:
    — School of Government. Ladies and gentlemen, Jonathan Zittrain.

    Jonathan Zittrain:
    Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns?

    [laughter]

    Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter]

    Jonathan Zittrain:
    It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera — that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft.

    [laughter]

    Jonathan Zittrain:
    But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered — and maybe they can, back to you guys — I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information — in the words of the decision by the European Court of Justice, “no longer relevant” in the view of the complainant — to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much.

    John Donvan:
    Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the “right to be forgotten,” is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And, more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is “censorship” — that the kind of right we’re talking about, where an individual can go to Google and say, “When my name comes up, I don’t want this link to show up or this link to show up or this link to show up,” amounts to censorship. How do you respond to that? Paul Nemitz.

    Paul Nemitz:
    I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU “right to be forgotten” exactly as ruled by the European Court of Justice.

    John Donvan:
    Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship.

    Paul Nemitz:
    It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives —

    John Donvan:
    Okay. Let me let the other side respond to that. Andrew McLaughlin.

    Andrew McLaughlin:
    Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship.

    John Donvan:
    When you say — just to be clear, when we say “deletion of information published by others,” that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online.

    Andrew McLaughlin:
    That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So,
    again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story.

    John Donvan:
    But the story is still there.

    Andrew McLaughlin:
    That’s correct.

    John Donvan:
    I’ve got to — so let me — let me bring in Eric Posner.

    Eric Posner:
    Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a —

    John Donvan:
    Just for folks who aren’t lawyers, can you explain the term “Tort action”?

    Eric Posner:
    You can just sue them and get money. And —

    [laughter]

    Eric Posner:
    As a consequence —

    John Donvan:
    Now we’re talking.

    Eric Posner:
    As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places that faces are blurred out. Well, that’s a kind of censorship as well.

    Andrew McLaughlin:
    Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner:
    Not the subsequent. But the original —

    Andrew McLaughlin:
    But that’s not censorship. That’s a court action —

    Eric Posner:
    the original —

    John Donvan:
    Let’s let Paul — Eric continue.

    Andrew McLaughlin:
    Sorry.

    John Donvan:
    Eric, were you done [unintelligible]?

    Eric Posner:
    Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy.

    John Donvan:
    We’ll take Jonathan Zittrain.

    Jonathan Zittrain:
    I wouldn’t get too hung up on the word “censorship.” I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a “right to be forgotten” implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down — I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks — to use an example from the ’90s that you might relate to.

    John Donvan:
    Would — would one of you —

    Jonathan Zittrain:
    That just seems Borgesian.

    John Donvan:
    I want to allow one of you to a ddress that metaphor. The audience connected with that.

    [laughter]John Donvan:
    So , I want to see what your response to it is.

    Paul Nemitz:
    I can —

    John Donvan:
    Do you want to take it, Paul? All right. Eric Posner.

    Eric Posner:
    Here is another metaphor, so —

    [laughter]

    Eric Posner:
    Well, for example —

    John Donvan:
    It has to be a direct response metaphor or you have to directly respond to their —

    Eric Posner:
    Okay. So the — so —

    John Donvan:
    And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually —

    Male Speaker:
    You know, the [unintelligible]. You just have to rearrange the letters.

    Eric Posner:
    I agree — I agree with Jonathan.

    John Donvan:
    Eric Posner.

    Eric Posner:
    I agree with Jonathan that the word “censorship” is not helpful here. What the “Right to Be Forgotten” does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person — become a politician — journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy.

    John Donvan:
    Andrew McLaughlin.

    Andrew McLaughlin:
    I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, “Look, there’s just nothing much to see here,” like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that, that principle will expand. There’s no way that it’s just going to be left —

    John Donvan:
    Paul Nemitz.

    Andrew McLaughlin:
    — limited to search engines —

    Paul Nemitz:
    Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know — an interest in someone who stepped out, him or herself, into the public — the “Right to Be Forgotten” doesn’t apply. The example, for example, which Andrew gave from the Washington Post — namely, that a concert pianist was asking for a takedown of the critique, the bad critique, of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re maybe not a very political person, but from time to time you push a like button on Facebook.

    [laughter]

    Paul Nemitz:
    Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, “I’m a Democrat,” “I’m a Republican.” Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect.

    John Donvan:
    Let me —

    Paul Nemitz:
    Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is: what is more of a chilling effect on a democracy — if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking —

    John Donvan:
    Okay, Paul —

    Paul Nemitz:
    — or giving you the right to deletion.

    John Donvan:
    — you’ve phrased the question —

    [applause]

    John Donvan:– twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect?

    Jonathan Zittrain:
    I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes — so that’s factually not true.

  • 00:46:05

    Facebook —

    Paul Nemitz:
    Not today, Jonathan.

    Jonathan Zittrain:
    — well, yeah —

    Paul Nemitz:
    But tomorrow, my friend.

    Jonathan Zittrain:
    — I — I’m saying —

    [laughter]

    Paul Nemitz:
    Right?

    Jonathan Zittrain:
    — yes, yes —

    [applause]

    Paul Nemitz:
    Here we are in America. The innovation of technology doesn’t stop.

    Jonathan Zittrain:
    — yes, yes, and —

    John Donvan:
    Jonathan, I think, though, that the —

    Jonathan Zittrain:
    — yeah.

    John Donvan:
    — the spirit of his question should not be lost in the technicalities.

    Jonathan Zittrain:
    Yes.

    John Donvan:
    I think he’s asking a serious question here.

    Jonathan Zittrain:
    Just the fact that he’s wrong should not, I agree —

    [laughter]

    Jonathan Zittrain:
    — block the — yeah, that’s true.

    [talking simultaneously]

    John Donvan:
    He might be wrong about what happens with the like button.

    Jonathan Zittrain:
    Yeah.

    John Donvan:
    He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power.

    Jonathan Zittrain:
    I think both are bad situations, and I reject having to choose between the two of them. Because if that’s the choice I have to make — if it’s really the choice, any time somebody says, “You can have this, or you can have every single thing you do in the world be Google-able forever” — I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, jeez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable —

    That’s what I object to.

    John Donvan:
    Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are “no longer relevant,” “inadequate” information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner.

    Eric Posner:
    These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component — you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it.

    Jonathan Zittrain:
    It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, which at the insistence of the government it must keep secret, and the cases we hear about are happenstance, how does that evolution take place?

    John Donvan:
    Let’s — can you defer to Paul? Paul Nemitz.

    Paul Nemitz:
    I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make a decision on a daily basis: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this, under the law, should not be taken down.

    John Donvan:
    Andrew —

    Paul Nemitz:
    On the other hand —

    John Donvan:
    Well, let me —

    Paul Nemitz:
    It is not different from any other newspaper previously. Newspapers have to make these decisions —

    John Donvan:
    Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that?

    Andrew McLaughlin:
    Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past.

    John Donvan:
    Why do you — why do you think that?

    Andrew McLaughlin:
    The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you.

    John Donvan:
    So, we’re getting all scientific about it.

    Andrew McLaughlin:
    Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google “right to be forgotten horror stories.” That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European “right to be forgotten.”

    Jonathan Zittrain:
    Well, there goes this debate. It’s now going to be deleted again. Thanks a lot.

    [laughter]

    John Donvan:
    Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up.

    Andrew McLaughlin:
    But I — let me just —

    John Donvan:
    It’s not over yet.

    Andrew McLaughlin:
    Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech — a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan:
    How about that, Eric Posner?

    Eric Posner:
    Well, the poor people, the non-elites that you’re talking about — they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories.

    Andrew McLaughlin:
    But you could just have a form on Google to type it in —

    Eric Posner:
    They’re the most vulnerable people. The —

    John Donvan:
    But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible].
    Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do.

    John Donvan:
    Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible].

    Jonathan Zittrain :
    We want to give you buy one, get one free.

    John Donvan:
    All right. There you go.

    [laughter]

    John Donvan:
    No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost.

    Female Speaker:
    Is it possible that the “right to be forgotten” infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving,
    and another person in the vehicle, you know, gets out and saves a bunch of people or,
    you know, the drunk driver from doing more harm or what have you.

    John Donvan:
    But you recognize that that forgetting will only happen when you put in the driver’s name.

    Female Speaker:
    Right. But still, I mean, that’s what I’m as king.

    John Donvan:Okay.

    Female Speaker:
    Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because —

    John Donvan:
    But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for —

    Female Speaker:
    Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How —

    John Donvan:
    But think — is it —

    Female Speaker:
    Did they infringe upon each other’s [unintelligible]?

    John Donvan:
    Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story — which won’t be linked under the other guy’s name — push back and want that link to show up under the other guy’s name?

    Paul Nemitz:
    Haven’t heard about it.

    John Donvan:
    Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you.

    Male Speaker:
    Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way in your view to this debate?

    Jonathan Zittrain:
    You got me.

    [laughter]

    Jonathan Zittrain:
    How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know —

    John Donvan:
    Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of —

    Jonathan Zittrain:
    But I think —

    John Donvan:
    — interest in this regard. Do you think this is a relevant question?

    Jonathan Zittrain:
    Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, jeez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it.

    John Donvan:
    It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain:
    What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes.

    John Donvan:
    Well —

    Jonathan Zittrain:
    So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan:
    Jonathan, that is what’s called “Hitting a curve ball out of the park.”

    Jonathan Zittrain:
    Thank you.

    John Donvan:
    Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the “right to be forgotten” online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker:
    Hi.

    John Donvan:
    If you can stand up and grab the mic.

    Male Speaker:
    Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?

    John Donvan:
    Paul Nemitz, in Europe.

    Paul Nemitz:
    This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan:
    Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin:
    I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the “right to be forgotten.”

    Paul Nemitz:
    All search engines.

    Andrew McLaughlin:
    Paul frames this as just like a very, “Nothing to see here,” kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz:
    It’s not Google specific that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan:
    Sir. Can you tell us your name, please?

    Male Speaker:
    Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the “No” side is this: Are there ways — mechanisms — to temper maybe things that we see as problematic about the “right to be forgotten” in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power would be delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan:
    Good question.

    Male Speaker:
    And then if, you know, Google wanted to, you know, deny a request or, you know,
    didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan:
    Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a “right to be forgotten” workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain:
    It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin:
    I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan:
    Eric, did you want to respond to that?

    Eric Posner:
    It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —

    [laughter]

    Eric Posner:
    — you make a — you know, for example, I would completely rule out public figures from using the “Right to Be Forgotten,” only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the “Right to Be Forgotten” were implemented.

    John Donvan:
    [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker:
    [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan:
    Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the “Right to Be Forgotten.”

    Female Speaker:
    Right, so to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

    John Donvan:
    Eric Posner.

    Eric Posner:
    Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan:
    I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker:
    For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, “You are being traced,” just to access the suppressed articles — because suppressed is not the same as delete —

    John Donvan:
    Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain:
    I —

    John Donvan:
    Thank you for the question.

  • 01:06:56

    Jonathan Zittrain:
    — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, “For anybody interested in receiving communist literature” — I’m not making this up — “you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you.” The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan:
    Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan:
    When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan:
    Yeah, thanks.

    Male Speaker:
    This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan:
    Okay. So you’re asking if you were smart enough, which sounds like you probably are —

    [laughter]

    John Donvan:
    — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz:
    Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan:
    So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz:
    Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan:
    Sir in the back near the wall. Thanks.

    Male Speaker:
    Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that.

    John Donvan:
    Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker:
    In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan:
    In a sense are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker:
    Not exactly. I’m saying another way of press suppression —

    John Donvan:
    Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker:
    Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain:
    I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan:
    Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner:I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17 –
    year -old or 16 -year -old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now,
    the person who Googles you may well know nothing about what you’ve done over the last 33 y ears. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50 -year -old person is the same as the 17 -year -old.

    John Donvan:
    From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin:
    Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan:
    Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz:
    Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court.

    Male Speaker:
    But Paul –

    Paul Nemitz:
    This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker:
    Yeah, Paul, I have to — [unintelligible].

    John Donvan:
    [unintelligible].

    Andrew McLaughlin:
    I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan:
    But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

    Andrew McLaughlin:
    They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner:
    There are interests on both sides.

    John Donvan:
    Eric Posner.

    Eric Posner:
    So, that’s the classic role for the law, so that the interests on both sides —

    Andrew McLaughlin:
    How are they vindicated?

    Eric Posner:
    And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money by how well it searches — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain:
    — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner:
    No, but they don’t want to set a precedent where they —

    Jonathan Zittrain:
    Google will pay no penalty for over -deleting.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    And no one will even know.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    Who will even know?

    Eric Posner:
    They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other.

    John Donvan:
    And that concludes — that concludes round two of this Intelligence Squared US debate,
    where our motion is the US should adopt the “right to be forgotten” online.

  • 01:17:58

    [applause]

    John Donvan:
    Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz:
    Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a “right to be forgotten” — not necessarily the one of the EU, but a “right to be forgotten.” Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the “right to be forgotten” online.

  • 01:20:06

    Thank you very much.

    John Donvan:
    Thank you, Paul Nemitz.

    [applause]

    John Donvan:
    And that’s our motion, the US should adopt a “right to be forgotten” online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin:
    You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan:
    Thank you, Andrew McLaughlin.

    [applause]

    John Donvan:
    And the motion is “The U.S. Should Adopt the Right to Be Forgotten Online.” And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner:
    This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again but this time that it’s you, that perhaps you’re going through hard times in your marriage if you are married,
    and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.
  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was, “He who controls the past controls the future.” Orwell’s invocation was inevitable. But I want to ask you, though, who is the “he” in that statement? Is the “he” you, is the “he” me, or is the “he” Google? Thank you very much.

    John Donvan:
    Thank you, Eric Posner.

    [applause]

    John Donvan:
    The motion is “The U.S. Should Adopt the Right to Be Forgotten Online.” And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society.

    Jonathan Zittrain:
    I can’t believe in the last slot I’m still the first to be able to thank everybody, Paul, Eric,
    John, Andrew, and all of you for an unforgettable evening.

    [laughter]
    This has been wonderful.

    Sorry. We’ll strike that later.

    We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — “Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out.” But then I come to the question Eric just asked: who is “we”? And in the proposal, “we” is Google — not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, “When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?” Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition —

    John Donvan:
    Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain:
    — can I end with one thing —

    John Donvan:
    — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion “The U.S.
    Should Adopt the Right to Be Forgotten Online.” If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side,
    push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this was a little bit of a tough debate in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about. You just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan:
    And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain:
    Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan:
    Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center, and we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the “Right to be Forgotten” Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already?

    Male Speaker:
    It was erased.

    [laughter]

    Male Speaker:
    Does anybody remember?

    [applause]

    John Donvan:
    All right. Well, for the sake of people who are listening —

    [laughter]

    Play along, please.

    [laughter]

    Our motion is this: The U.S. Should Adopt the ” Right to be Forgotten” Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the “Right to be Forgotten” Online: on the first vote, 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points.

    [applause]

    John Donvan:
    The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time.

    [applause]

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz.

    [applause]

    John Donvan:
    Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC.

    [laughter]

    John Donvan:
    And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what?

    Paul Nemitz:
    I meant that there are limits to snooping, collecting and making my private life public on Google.

    John Donvan:
    And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument.

    Paul Nemitz:
    The eminent Professor Eric Posner, from the University of Chicago Law School.

    John Donvan:
    Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause]

    John Donvan:
    Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now?

    Eric Posner:
    No, they haven’t, but I’m hoping they’ll change their minds soon enough.

    John Donvan:
    Especially after tonight.

    Eric Posner:
    Especially after tonight.

    John Donvan:
    Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

    [applause]

    John Donvan:
    And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin.

    [applause]

    John Donvan:
    Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language?

    Andrew McLaughlin:
    [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty.

    John Donvan:
    Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew?

    Andrew McLaughlin:
    The equally eminent Professor Jonathan Zittrain.

    John Donvan:
    Ladies and gentlemen, Jonathan Zittrain.

    [applause]

    John Donvan:
    The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain:
    Well, we could end early and just hit the bar, but —

    [laughter]

    Jonathan Zittrain:
    — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes.

    John Donvan:
    All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz.

    [applause]

    Paul Nemitz:
    Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said, sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much.

    John Donvan:
    Thank you, Paul Nemitz.

    [applause]

    John Donvan:
    And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin.

    [applause]

    Andrew McLaughlin:
    Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat, Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.

    John Donvan:
    Thank you, Andrew McLaughlin.

    [applause]

    John Donvan:
    And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause]

    Eric Posner:
    I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say: please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The US Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The US Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that whether it was fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say deletion of information published by others, that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who has stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, for example, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or if we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where you actually — is it more chilling for some entity to have that power, or an individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make — if it’s really the choice — any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision that, at the insistence of the government, it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make a decision on a daily basis: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made, himself or herself, the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, just in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, from the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate?

    Jonathan Zittrain: You got me.

    [laughter]

    Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know —

    John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of —

    Jonathan Zittrain: But I think —

    John Donvan: — interest in this regard. Do you think this is a relevant question?

    Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it.

    John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific just because the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute that said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious that you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states that allow people’s criminal records to be erased — mean that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together — meaning that he’s on our side.

    [laughter]

    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down on your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion; they’re arguing for the "right to be forgotten."

    Female Speaker: Right. So, to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity: just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, on your question about the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe apply to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?

    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is, perhaps, you know, an alarming signal against imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying another way of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did? Or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — and I don’t mean that to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly — and then a judge will decide — or you can go to an independent data protection authority, which is something like an independent agency, like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, it’s blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: all of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children — we censor a lot of different kinds of information. But in each case it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever -evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photo graphs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting hi story, forgetting the past. The right to remember, is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and st atements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine, again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you? Is the "he" me? Or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, including possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you if you would be willing to make a donation to go to our website, that’s IQ2US.org and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business Law and Public Policy at Columbia, as well as with the National Constitution Center, we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online and the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote the result was for the team arguing for the motion their second vote was 35 percent, that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station, all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the federal trade commissioner, Julie Brill, just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten" is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember is one of the most fundamental rights I think of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing, vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat, Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern, Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy, he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on but they still have a relatively complete image of who she is. A third example, a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is are there problems with privacy? Yes, there are. And if that were the motion I would hope you would join us in voting for it. If the question were have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist dictators or communist dictators, don’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. What chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or an individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? A) Whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just, like, a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific; it’s that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.
    [laughter]
    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion — they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So, to me, it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, where a reasonable expectation of privacy is something you don’t have the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU, it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir, in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal against imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir, in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no — we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google — that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, to weigh the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. Google makes money from search — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry — the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you if you would be willing to make a donation to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and in the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result was, for the team arguing for the motion, their second vote was 35 percent. That means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech, and therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision, you know, was in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold. For example, in a political campaign, those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything from before they were 18. In the Fair Credit and Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat, Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy, he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on but they still have a relatively complete image of who she is. A third example, a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant," in the view of the complainant, information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, when we say "deletion of information published by others," that is deletion, literally, of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct-response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here." Like, this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking, and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes. So that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook? You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? A) Whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I'm Adam. This law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.
    Paul Nemitz: It's not Google-specific; it's just that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there mechanisms to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to deny a request, or didn't want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And — I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states that allow people's criminal records to be erased, so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I are very close together — meaning that he's on our side.
    [laughter]
    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria. Now, often it's very difficult to set them up in advance, precisely because we're in a new world with new technology. And so you have to build them up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in front in the third row, please. And the mic's coming down on your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion; they're arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that — going to your constitutional point — with the fact that, at least in the U.S., in the Fourth Amendment search-and-seizure context, you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on.
    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity; you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it's exactly the kind of surveillance on who's wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.
    John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are — [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me applies in Europe to the state and also to corporations — but of course, within reason, with national security limitations and so on; you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us, pointing to existing EU press law — that there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps an alarming signal against imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I'm saying another way of press suppression —
    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]
    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kid put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody's life. And so, you know, my view on this is, still, that censorship — which I don't mean to be a kind of nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google where, you know, there's the blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.
    John Donvan: But it does live on. It does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that's the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: the profit motive. Google makes money from search; the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don't want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz. [applause]
    John Donvan: And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin. [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you — that perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner. [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can't believe that in the last slot I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically wired into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that it's not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn't offer solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, including possibly even somebody like Google deciding that when it's a search on a name, maybe the first result shouldn't be those 10 terrible links, whatever comes out of the roulette wheel, but a curated page marked as such — and then many won't go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I'm sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. We'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so. [applause]
    John Donvan: Well, I'm sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent — that means they lost one percentage point. The team arguing against the motion went from 26 percent on the first vote to 56 percent on the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship directorate in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve most recently written the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion — of course, within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice, or if you are a politician. There are limits to this, limits which are important in a democratic society — because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals, and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state — because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish — but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go on to the courts. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and for children when they become 18 — to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," are powerful. And they are understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information — not libel, not defamation, not hate speech.

  • 00:15:59

    True information. The "right to be forgotten" intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. Those of you who are old enough will remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy who is arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important — but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control — they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have I assume some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment when it is granted Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say in secret so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascistic dictators or communist dictators, they don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state with some trepidation to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down, it’s like saying the book can stay in the library. We just have to set fire to the card catalog, like go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. What chilling effect is bigger, if we are able to know everything about you and even in a way which you wouldn’t even think of or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do on the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten when you have a private company making the decision at the insistence of the government it must keep it secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion and, for example, you know, newspapers have to make on a daily basis a decision do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of like EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here, where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question like directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in this story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way in your view to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just, like, a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google specific — the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute that said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter]
    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "Right to Be Forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]
    Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal against imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying it’s another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google — that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, so that the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote: the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz. [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin. [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner. [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t owe solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April, we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent on the first vote to 56 percent on the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]
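For listeners following the scoring, the rule Donvan describes (the side whose vote share swings the most between the two audience votes wins) can be sketched in a few lines of Python. The percentages below are the ones announced in the debate; the second-vote undecided share (9 percent) is an inferred figure, filled in only so the columns total 100.

```python
# Intelligence Squared scoring sketch: the side whose share of the audience
# vote moves most between the first and second votes is declared the winner.
# Figures as announced; second-vote "undecided" inferred to make totals 100.
first_vote = {"for": 36, "against": 26, "undecided": 38}
second_vote = {"for": 35, "against": 56, "undecided": 9}

# Swing for each debating side (undecideds don't compete for the win).
swing = {side: second_vote[side] - first_vote[side] for side in ("for", "against")}
winner = max(swing, key=swing.get)  # side with the largest gain

print(swing)   # {'for': -1, 'against': 30}
print(winner)  # against
```

On these figures the "against" side gains 30 points while "for" loses one, which matches the result announced on stage.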

  • 01:31:40

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared U.S. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship directorate in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense, I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and children when they become 18 — to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and the federal trade commissioner, Julie Brill, just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

True information. The "right to be forgotten" intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera — that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information — in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant — to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response to their calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog — like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post — namely that a concert pianist was asking for a takedown of the critique, the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I’ll give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook — you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision — which, at the insistence of the government, it must keep secret — and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about — how strict the criteria would be — and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —
    [laughter]
    Eric Posner: — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion — they’re arguing for the "Right to Be Forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should the first thing they find be this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google — that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law — so that the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. Google makes money — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified; it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. That perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you. This is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure transparency is not about any specific case, only about the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, IQ2US.org, and make a donation there. Our next debate is later this month, at Columbia University at the Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center, to continue the ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. On "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," the first vote was 36 percent for, 26 percent against, and 38 percent undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of Human Rights Law." And back in the U.S., when Europe adopted its "Right to Be Forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he has done, or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And with that, I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask a school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and there was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go on to the courts. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters on every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. Those of you who are old enough will remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic, a kooky academic like me or Jonathan, came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days the divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have I assume some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment when it is granted Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say in secret so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that whether fascist dictators or communist dictators, they don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response to their calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state with some trepidation to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests have — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. What chilling effect is bigger, if we are able to know everything about you and even in a way which you wouldn’t even think of or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten when you have a private company making the decision at the insistence of the government it must keep it secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion and, for example, you know, newspapers they have to make on a daily basis a decision do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible] the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, I need to just in the interest of time I need to interrupt you for that, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of like EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote, what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here, where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question like directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in this story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose a question directly to you, should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way in your view to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that is now not the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over. I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google specific that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper in your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passes a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power would be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, which says that as for a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer for his private purposes design his own web engine, so this constellation would mean that you are not a controller, you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic rule for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, directorate general for justice and consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hand, also in the times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this was a little bit of a tough debate in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about. You just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and the ticket sales don't come close to covering the cost of mounting a great debate like this. So, I'd encourage you, if you would be willing to make a donation, to go to our website, that's IQ2US.org, and make a donation there. Our next debate is later this month. It's at Columbia University at the Miller Theater. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We're going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we're going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We've heard the arguments — sorry? Did I — sorry? There's chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We'll see you next time. [applause]


  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I've done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let's have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I'm John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

Let's meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that's what you're going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You've written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe adopted its "right to be forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven't, but I'm hoping they'll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let's welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You're no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I'd probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There's really no place in the world that's been so affected by it. But I do think it's a travesty. John Donvan: Okay, we're making clear you're not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you're a professor of law and computer science at Harvard. You're a cofounder of the Berkman Center for Internet & Society. You looked at the EU's — Europe's "right to be forgotten," and you said that there is a certain elegance to the idea, because you're saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I've described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We're very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you'll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let's register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

Push number one if you're for the motion, number two if you're against, and number three if you're undecided. And we'll take about 15 seconds to complete that process, and then we'll lock out the voting devices. All right, folks, let's lock out those votes. And, again, we'll have the second vote right after you've heard the closing arguments. And then we'll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I'm not here on an official mission from the European Commission to turn America to this notion of law. It's more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten" — the words, the natural words — they don't tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the '70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can't ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that's what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don't have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn't fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn't apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank — you know, they can ask a school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and there was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it's in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn't mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around — but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what's happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives parents the possibility — and children when they become 18 — to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and the federal trade commissioner, Julie Brill, just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I'm going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn't even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that's the outline of the argument that I'm going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it's a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let's be very clear, that is what we're talking about here. We're talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we're talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it's way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

The language in the European court's decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It's very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul's claim that it's working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you'll find the case of the piano player who didn't like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it's conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that's not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we've seen that the internet adapts. We've seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I'll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what's going on: we are halfway through this opening round of the Intelligence Squared US debate. I'm John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You've heard two of the opening arguments, and now on to the third. Let's welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I'd like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you'll remember; for some of the younger people here, I'll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He's arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That's not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It's not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she's able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she's like and how she is a good mother. So, they know a little bit about what's going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other's infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they're private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let's say it could be an employer or a creditor or a future romantic partner — it's really in the stranger's interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn't convicted, it's still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people's privacy." But, of course, this is what's happened. It's happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the '80s and '90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That's completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people's privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so if we go through these three people again: this kid, 10, 15 years later — let's suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother's friends, you know, in the '90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would've written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That's how she's going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It's public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents' names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they've lost control over this information about themselves, and it's not just that they've lost this control; they've lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We'd vote for that motion, too. I think I'm comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests; it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him- or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power, or an individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant" and "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is that through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story, I’m just taking one at random here, where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked on implementation. Are there cases where individuals who are otherwise mentioned in a story, which won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I'm Adam. This law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents, in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.
    Paul Nemitz: It's not Google-specific; it's just that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms to temper the things that we see as problematic about the "right to be forgotten" — in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It's not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased, that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side. For the —
    [laughter]
    Eric Posner: — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set them up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front, in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion, they're arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search-and-seizure context, which says that you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on.
    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it's exactly the kind of surveillance of who's wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it'd give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.
    John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you'd like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I'm saying another way of press suppression —
    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody's life. And so, you know, my view on this is, still, censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency, like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google, and that's, you know, the blue sky, and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I've got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.
    John Donvan: But it does — it does live on. It does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that's the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. So Google makes money the better its searches are; the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don't want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know because it will do worse than an alternative search engine, in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that's our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it's you: perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can't believe that, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening is shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that happens not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn't offer solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won't go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I'm sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what was it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.
    [applause]
    John Donvan: Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and ticket sales don't come close to covering the cost of mounting a great debate like this. So I'd encourage you, if you would be willing to make a donation, to go to our website, that's IQ2US.org, and make a donation there. Our next debate is later this month at Columbia University at the Miller Theater. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center, to continue our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April, we're going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We've heard the arguments — sorry? Did I — sorry? There's chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second; they went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We'll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan:When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wante d to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a de bate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forg otten’ Online.” As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the wi nner. And only one side ones. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the directo r of this — of this organization, this agency : The Fundamental Rights and Union Citizenship in the Directorate G eneral for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School . John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its " Right to Be Forgotten ” law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So , my question is have more of the critics come over to your side now? Eric Posner:No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: E specially after tonight . Eric Posner: E specially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were , would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I ’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making cle ar you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain.[applause] John Donvan: The eminent Jonath an Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Ber kman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor so lution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken do wn sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ," in this debate , and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this moti on for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So , let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screen s. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes wit hin about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these g reat American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words , the natural words, they don’t tell you the whole story , because actually it is about a deletion right . I t is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why this is important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self -determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about y ou. You must be able to ask “ What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done i n terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are i mportant in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy , how can you organize disse nt? How can you organize a new political party, which maybe wants to [unintelligible] , if already the government always knows everything about you. So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case,“T his doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them . When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person , and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, t he person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auc tioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So , the court said you cannot ask the newspaper to take down t he information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’ t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper of the BBC, of the television station, all this stays around, but Google is subject to the same law of self -control of information, of self-determi nation of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these reque sts in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood t he European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger censors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new censors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded and there areaddress dealers and information dealers which have files o n you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold . For example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children, then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the po ssibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit and Reporting Act about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the federal trade commissioner, Julie Brill just yesterday said, “ Wes, this is what we need in America, a right to obscurity. ” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Dig and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so -called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies be hind the "right to be forgotten" is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten".

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forg et what they would otherwise remember. That ability, that right to remember is one of the most fundamental rights I think of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel , not defamation, not hate speech.

  • 00:15:59

    True information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought which is ultimately what memory is. So, this is the right to force people to forg et true information. Paul said that this is a — the way he framed it was that this is a right to control your ow n life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say . And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European courts’ decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer irrelevant or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well -conne cted elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed , because Google g ives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples : a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And the se are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing, vis -a -vis dictatorships. In a society with steady rule of law, I suppose it’s conceiva ble that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flush it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    S o , as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat , Wuut, W-U -U -T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden andembrace and allow for the self -reinvention that in so many ways define s what it is to be American ?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex -felons for the rest of their lives, not to penalize them by ma king it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future ." G iving any number of individuals a vague standard by which to contro l their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: A nd a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lecturn, Eric Posner. He is the Kirkland and Ellis distinguished service professor of law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17 -year -old boy, he’s arrested for selling drugs. A news item appears in the local pape r. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    S he sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a li ttle bit about what’s going on but they still have a relatively complete image of who she is. A third example, a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, so me of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law p rotects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called, "expungement statutes," that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And a lthough free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about the mselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay . That’s completely changed. Technology has changed. So the law at the time — th e privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance . Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Goo gle, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their paren ts ’ names into Google and out comes these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So , they’ve lost control over this information about th emselves and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with thisbalance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online. " A nd here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law S chool and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is are there problems with privacy ? Y es , there are. And if that were the motion I would hope you would join us in voting for it. If the question were have problems of privacy gotten more difficult over time ? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google specific that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power will be delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —
    [laughter]
    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion — they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that a reasonable expectation of privacy — you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller. You would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google — that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, so that the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money the better its searches are — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: The US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today — imagine it now — this is what people learn about you. This is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the folks on the other side were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. From me, John Donvan, and Intelligence Squared U.S., we’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered — by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked" — by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner, from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve most recently written the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a co-founder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the 'Right to Be Forgotten' Online" — and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data — you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion — of course, within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society — because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state — because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and for children when they become 18 — to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the federal trade commissioner, Julie Brill, just yesterday said, "Yes, this is what we need in America: a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas — all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information — not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut — W-U-U-T — which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they were private. Not everybody in the world, or even outside this area, heard about these events. And the law protected them. There were privacy laws — they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends — you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control — they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is, the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving is that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index, rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or the individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — the good work that Paul is doing in the EU to look at a Facebook and say, jeez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is that through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, keeps the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in a story, one that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, jeez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: the Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power would be delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.

    [laughter]

    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented.

    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten."

    Female Speaker: Right. So, to me, it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, which says, you know, that a reasonable expectation of privacy is something you don’t have the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as delete —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU, it sounds like, give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on, you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal against imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying another way of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete, is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, against the decision of the FTC, to go to court.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, where it’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law, so that the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money by searching well: the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual is on one side of a conversation, which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified because it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever -evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photo graphs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting hi story, forgetting the past. The right to remember, is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and st atements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it turns not on a specific case, only on the overall criteria; that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business Law and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already?
    Male Speaker: It was erased.
    [laughter]
    Male Speaker: Does anybody remember?
    [applause]
    John Donvan: All right. Well, for the sake of people who are listening —
    [laughter]
    Play along, please.
    [laughter]
    Our motion is this: "The U.S. Should Adopt the 'Right to be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to be Forgotten' Online." In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points.
    [applause]
    John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time.
    [applause]

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz.
    [applause]
    John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC.
    [laughter]
    John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what?
    Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google.
    John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument.
    Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School.
    John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause]
    Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "right to be forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now?
    Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough.
    John Donvan: Especially after tonight.
    Eric Posner: Especially after tonight.
    John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."
    [applause]
    John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin.
    [applause]
    John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language?
    Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty.
    John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew?
    Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain.
    John Donvan: Ladies and gentlemen, Jonathan Zittrain.
    [applause]
    John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but —
    [laughter]
    Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes.
    John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz.
    [applause]
    Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and, second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and children when they become 18 — to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin.
    [applause]
    Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google, well, while you can, any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action — suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause]
    Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause]
    John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy —
    [applause]
    John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain.
    Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascistic dictators or communist dictators, they don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion and, for example, you know, newspapers, they have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over. I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined [spelled phonetically] to particular searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on, you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high on your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money from search — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in terms of — it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April, we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already?

Male Speaker:
It was erased.

[laughter]

Male Speaker:
Does anybody remember?

[applause]

John Donvan:
All right. Well, for the sake of people who are listening —

[laughter]

Play along, please.

[laughter]

Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points.

[applause]

John Donvan:
The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time.

[applause]

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz.

[applause]

John Donvan:
Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC.

[laughter]

John Donvan:
And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what?

Paul Nemitz:
I meant that there are limits to snooping, collecting, and making my private life public on Google.

John Donvan:
And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument.

Paul Nemitz:
The eminent professor Eric Posner from the University of Chicago Law School.

John Donvan:
Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause]

Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe adopted its "right to be forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said that you actually thought it sounded perfectly sensible. So my question is: have more of the critics come over to your side now?

Eric Posner:
No, they haven’t, but I’m hoping they’ll change their minds soon enough.

John Donvan:
Especially after tonight.

Eric Posner:
Especially after tonight.

John Donvan:
Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

[applause]

John Donvan:
And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin.

[applause]

John Donvan:
Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language?

Andrew McLaughlin:
[laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty.

John Donvan:
Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew.

Andrew McLaughlin:
The equally eminent Professor Jonathan Zittrain.

John Donvan:
Ladies and gentlemen, Jonathan Zittrain.

[applause]

John Donvan:
The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a co-founder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s — "right to be forgotten," and you said that there is a certain elegance to the idea, because, you’re saying, something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain:
Well, we could end early and just hit the bar, but —

[laughter]

Jonathan Zittrain:
— I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes.

John Donvan:
All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn. And here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz.

[applause]

Paul Nemitz:
Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten": the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data — you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he or she has done or in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society — because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals, and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, or the school for things to be deleted after a certain time, they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state — because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

So the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but you can ask Google. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that they are now dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which go to court. Google has understood the European law of balance between privacy and free speech. The decisions were, in the beginning, very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

And these files are sold — for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." A last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California there is now a law which gives parents the possibility — and children when they become 18 — to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, you can ask for credit information which is older than seven years to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America: a right to obscurity." Thank you very much.

John Donvan:
Thank you, Paul Nemitz.

[applause]

John Donvan:
And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper and a former U.S. deputy chief technology officer in the Obama administration. Ladies and gentlemen, Andrew McLaughlin.

[applause]

Andrew McLaughlin:
Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake you made that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. And there are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: the right to be forgotten. We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here: the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. This is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, a right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s far too prone to abuse. It is vague; it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about a person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. One other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this interest. For example, in the American tradition, we believe not in censorship but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut — W-U-U-T — which are applications designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so, finally, I’ll just end on this note: as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.

John Donvan:
Thank you, Andrew McLaughlin.

[applause]

John Donvan:
And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause]

Eric Posner:
I’d like to ask you to cast your minds back 25 or 30 years. For those of you who are old enough — or, for some of the younger people here, I’ll just have to tell you what it was like then — we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy who is arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, misses some credit card bills, and files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her, what she’s like, and how she is a good mother. So they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine that back then, in 1990, an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected; but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. So, if we go through these three people again: this kid, 10 or 15 years later — let’s suppose all of this happened in the last few years — applies for a job. The employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, or he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, who in the ’90s would have gossiped, would today write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, the divorce would be public — it’s public information — but it was put in a file that was stored in the basement of a courthouse. If you wanted to, you could find that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control: they’ve lost the important context. The neighbors, the people in the neighborhood, know the context of these events. The strangers who do the Google search do not. So, the "right to be forgotten" would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "right to be forgotten" online in the United States. Thank you very much.

  • 00:27:57

[applause]

John Donvan:
Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy —

[applause]

John Donvan:
— School of Government. Ladies and gentlemen, Jonathan Zittrain.

Jonathan Zittrain:
Thank you very much, John, and thank you all. I guess I should start with some points of agreement. If the question is "Are there problems with privacy?" — yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were "Have problems of privacy gotten more difficult over time?" — we’d vote for that motion, too, and I think I’m comfortable saying Andrew would vote for it. If the motion were "Should everything be recorded at all times, and if so, should we do something about that?" — I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for "no longer relevant" information, in the words of the decision by the European Court of Justice, in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, for example, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, I need to — just in the interest of time, I need to interrupt you for that, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish premier league referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results: here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, saves the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay — for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This — this law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states that allow people’s criminal records to be erased — mean that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.

    [laughter]

    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think there are administrative criteria we can set. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion — they’re arguing for the "right to be forgotten."

    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that, at least in the U.S., with your constitutional point — that in the Fourth Amendment search-and-seizure context, you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations — but of course, within reasonable national security limitations and so on. You know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying it’s another avenue of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should the first thing they find be this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody can create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America against a decision of the FTC, to go to court.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are toward deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. Nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to disappear. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that’s our motion: The US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified because it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says — falsely, maybe — that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So when you go out and talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the folks on the other side were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure transparency is not on specific cases, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t owe solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what was it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and the ticket sales don't come close to covering the cost of mounting a great debate like this. So, I'd encourage you, if you would be willing to make a donation, to go to our website, that's IQ2US.org, and make a donation there. Our next debate is later this month. It's at Columbia University at the Miller Theater. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We're going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we're going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We've heard the arguments — sorry? Did I — sorry? There's chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent on the first vote to 56 percent on the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We'll see you next time. [applause]


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the directo r of this — of this organization, this agency : The Fundamental Rights and Union Citizenship in the Directorate G eneral for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School . John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its " Right to Be Forgotten ” law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So , my question is have more of the critics come over to your side now? Eric Posner:No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: E specially after tonight . Eric Posner: E specially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were , would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I ’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making cle ar you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain.[applause] John Donvan: The eminent Jonath an Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Ber kman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor so lution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken do wn sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ," in this debate , and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this moti on for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So , let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screen s. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes wit hin about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these g reat American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words , the natural words, they don’t tell you the whole story , because actually it is about a deletion right . I t is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why this is important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self -determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about y ou. You must be able to ask “ What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done i n terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are i mportant in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy , how can you organize disse nt? How can you organize a new political party, which maybe wants to [unintelligible] , if already the government always knows everything about you. So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case,“T his doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them . When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person , and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, t he person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auc tioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So , the court said you cannot ask the newspaper to take down t he information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’ t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper of the BBC, of the television station, all this stays around, but Google is subject to the same law of self -control of information, of self-determi nation of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these reque sts in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood t he European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger censors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new censors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded and there areaddress dealers and information dealers which have files o n you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold . For example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children, then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the po ssibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit and Reporting Act about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the federal trade commissioner, Julie Brill just yesterday said, “ Wes, this is what we need in America, a right to obscurity. ” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Dig and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so -called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies be hind the "right to be forgotten" is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten".

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forg et what they would otherwise remember. That ability, that right to remember is one of the most fundamental rights I think of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel , not defamation, not hate speech.

  • 00:15:59

    True information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought which is ultimately what memory is. So, this is the right to force people to forg et true information. Paul said that this is a — the way he framed it was that this is a right to control your ow n life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say . And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European courts’ decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer irrelevant or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well -conne cted elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed , because Google g ives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples : a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And the se are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing, vis -a -vis dictatorships. In a society with steady rule of law, I suppose it’s conceiva ble that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flush it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    S o , as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat , Wuut, W-U -U -T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden andembrace and allow for the self -reinvention that in so many ways define s what it is to be American ?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex -felons for the rest of their lives, not to penalize them by ma king it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future ." G iving any number of individuals a vague standard by which to contro l their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: A nd a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lecturn, Eric Posner. He is the Kirkland and Ellis distinguished service professor of law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17 -year -old boy, he’s arrested for selling drugs. A news item appears in the local pape r. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    S he sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a li ttle bit about what’s going on but they still have a relatively complete image of who she is. A third example, a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, so me of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law p rotects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called, "expungement statutes," that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And a lthough free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about the mselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay . That’s completely changed. Technology has changed. So the law at the time — th e privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance . Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Goo gle, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their paren ts ’ names into Google and out comes these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So , they’ve lost control over this information about th emselves and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with thisbalance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online. " A nd here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law S chool and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is are there problems with privacy ? Y es , there are. And if that were the motion I would hope you would join us in voting for it. If the question were have problems of privacy gotten more difficult over time ? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up" — that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog — like, go down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct-response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — Paul is doing good work in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) Whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and I produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain:
    What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration, there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes.

    John Donvan:
    Well —

    Jonathan Zittrain:
    So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan:
    Jonathan, that is what’s called "Hitting a curve ball out of the park."

    Jonathan Zittrain:
    Thank you.

    John Donvan:
    Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker:
    Hi.

    John Donvan:
    If you can stand up and grab the mic.

    Male Speaker:
    Hi. I’m Adam. This — this law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?

    John Donvan:
    Paul Nemitz, in Europe.

    Paul Nemitz:
    This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan:
    Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin:
    I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz:
    All search engines.

    Andrew McLaughlin:
    Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz:
    It’s not Google-specific just because the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan:
    Sir. Can you tell us your name, please?

    Male Speaker:
    Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute that said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power would be delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan:
    Good question.

    Male Speaker:
    And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan:
    Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain:
    It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin:
    I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan:
    Eric, did you want to respond to that?

    Eric Posner:
    It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter]

    Eric Posner:
    — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented.

    John Donvan:
    [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker:
    [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan:
    Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten."

    Female Speaker:
    Right, so to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, that says, you know, with the Fourth Amendment search and seizure context, that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

    John Donvan:
    Eric Posner.

    Eric Posner:
    Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan:
    I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker:
    For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as delete —

    John Donvan:
    Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain:
    I —

    John Donvan:
    Thank you for the question.

  • 01:06:56

    Jonathan Zittrain:
    — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan:
    Another question? Right in the center there. Adamant waving actually does work with me. [laughter]

    John Donvan:
    When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously]

    John Donvan:
    Yeah, thanks.

    Male Speaker:
    This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines such as the NSA’s database of information that they have about us.

    John Donvan:
    Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter]

    John Donvan:
    — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz:
    Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on, you know, there we are not so far apart.

    John Donvan:
    So, how far are you getting in asking the NSA to delete information? [laughter]

    Paul Nemitz:
    Well, the answer is he described a case where he would, you know, as a developer for his private purposes design his own web engine, so this constellation would mean that you are not a controller, you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan:
    Sir in the back near the wall. Thanks.

    Male Speaker:
    Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal, but as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people, so perhaps Eric Posner you’d like to address that.

    John Donvan:
    Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker:
    In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan:
    In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker:
    Not exactly. I’m saying another way of press suppression —

    John Donvan:
    Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker:
    Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]

    Jonathan Zittrain:
    I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan:
    Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner:
    I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan:
    From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin:
    Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete, is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high on your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan:
    Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?

    Paul Nemitz:
    Well, Google as a private corporation is subject to the law as anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court.

    Male Speaker:
    But Paul —

    Paul Nemitz:
    This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker:
    Yeah, Paul, I have to — [unintelligible].

    John Donvan:
    [unintelligible].

    Andrew McLaughlin:
    I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan:
    But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

    Andrew McLaughlin:
    They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner:
    There are interests on both sides.

    John Donvan:
    Eric Posner.

    Eric Posner:
    So, that’s the classic role for the law, so that the interests on both sides —

    Andrew McLaughlin:
    How are they vindicated?

    Eric Posner:
    And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money by searching well: the better its searches are, the more money it makes. This is why — [talking simultaneously]

    Jonathan Zittrain:
    — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner:
    No, but they don’t want to set a precedent where they —

    Jonathan Zittrain:
    Google will pay no penalty for over-deleting.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    And no one will even know.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    Who will even know?

    Eric Posner:
    They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other.

    John Donvan:
    And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan:
    Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the European Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz:
    Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes possible the total prediction and total collection of anything you do, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan:
    Thank you, Paul Nemitz. [applause]

    John Donvan:
    And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin:
    You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation. That is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it is justified only when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan:
    Thank you, Andrew McLaughlin. [applause]

    John Donvan:
    And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner:
    This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now, this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan:
    Thank you, Eric Posner. [applause]

    John Donvan:
    The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society.

    Jonathan Zittrain:
    I can’t believe in the last slot I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening is shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it happens not on a specific case, only on the overall criteria; that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan:
    Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain:
    — can I end with one thing —

    John Donvan:
    — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this was a little bit of a tough debate in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about. You just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan:
    And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously]

    Jonathan Zittrain:
    Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause]

    John Donvan:
    Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates. And I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already?
    Male Speaker: It was erased.
    [laughter]
    Male Speaker: Does anybody remember?
    [applause]
    John Donvan: All right. Well, for the sake of people who are listening —
    [laughter]
    Play along, please.
    [laughter]
    Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points.
    [applause]
    John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time.
    [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared U.S. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz.
    [applause]
    John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC.
    [laughter]
    John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what?
    Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google.
    John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument.
    Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School.
    John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause]
    Yes, Eric, you are a law professor. You’ve written the book, most recently, "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "right to be forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now?
    Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough.
    John Donvan: Especially after tonight.
    Eric Posner: Especially after tonight.
    John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."
    [applause]
    John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin.
    [applause]
    John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language?
    Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty.
    John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew?
    Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain.
    John Donvan: Ladies and gentlemen, Jonathan Zittrain.
    [applause]
    John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but —
    [laughter]
    Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes.
    John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz.
    [applause]
    Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion — of course, within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this — limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this just doesn’t fall from the sky like this, but it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and for children when they become 18 — to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin.
    [applause]
    Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches on his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut — W-U-U-T — which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so, finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause]
    Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. Those of you who are old enough will remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days the divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause]
    John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy —
    [applause]
    John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain.
    Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera, that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, "Bing, how many of these are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say deletion of information published by others, that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index, rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here." Like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, for example, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique, the bad critique of his piano performance here in America. Yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, there are very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examp les, the guy who did the professional [unintelligible] the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, I need to just in the interest of time I need to interrupt you for that, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was justmaking. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy and Paul is actually saying it’s the ordinary guy who’s getting the take downs and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody I could then demand that my data be removed from, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just, like, a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific — it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute that said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —
    [laughter]
    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion — they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU, it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and, if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "What do you have about me?" in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir, in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir, in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still: censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And, by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google, as a private corporation, is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, so that the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money the better it searches — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination. And it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote: the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children, that you wouldn’t want them to hear, is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what those folks were saying over the course of the evening is shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t have solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on The U.S. Should Adopt the "Right to be Forgotten" Online: 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]
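For readers following the scoring, the winner-determination rule Donvan describes — the side whose share of the audience moves the most between the pre-debate and post-debate votes wins — can be sketched in a few lines of Python. The percentages are the ones read out in the transcript; the final undecided figure (9 percent) is inferred here so the totals sum to 100, and is not stated on stage.

```python
def debate_winner(pre, post):
    """Return the side with the largest gain between the two votes.

    pre/post: dicts mapping side -> percentage of the audience.
    """
    swings = {side: post[side] - pre[side] for side in pre}
    winner = max(swings, key=swings.get)
    return winner, swings

# Percentages as announced in this debate (undecided post-vote inferred).
pre_vote = {"for": 36, "against": 26, "undecided": 38}
post_vote = {"for": 35, "against": 56, "undecided": 9}

winner, swings = debate_winner(pre_vote, post_vote)
# "for" moved -1 point, "against" moved +30 points, so "against" wins.
```

Note that under this rule, the side that persuades the most undecided voters can win even if it never holds a majority of the room.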

  • 01:31:40

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side — that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision, you know, in the beginning was very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives the possibility for parents — and children when they become 18 — to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action — suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T — applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you’re expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make a decision on a daily basis: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the U.S. is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific — it’s just that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: the Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search and seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law; but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where it’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money from searching — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure there’s transparency not on specific cases, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this was a little bit of a tough debate, in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center, continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared U.S. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. Your most recent book is "The Twilight of International Human Rights Law." And back in the U.S., when Europe adopted its "Right to Be Forgotten" a little while back, most American academics were quite skeptical and even outraged, but you said you actually thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. Go to the keypads at your seat and take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. We’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments, and then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten": the words, the natural words, don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already in the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have that privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment: first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, the school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question, who brought this case from Spain, had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, in the beginning, very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California there is now a law which gives parents the possibility, and children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, you can ask for information about your credit which is older than seven years to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty: the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said, the way he framed it, that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches on his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. Those of you who are old enough will remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this; they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, and she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside the area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws, and they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves: not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine that back then, in 1990, an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay, that’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, who in the ’90s would have gossiped, would today write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public; it’s public information. But it was put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. The neighbors, the people in the neighborhood, know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place since decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and I produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about — how strict the criteria would be — and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So, to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it’s not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t have solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page, marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and in the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision, you know, was in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to use my opening statement to explain that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, "here’s what was deleted today," it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like , “Bing , how many of you are getting ?” They’re like, “We can’t tell you. ” It’s like, “ Y ou can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing? ” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprof it? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been e dited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant," in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is, the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here." Like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post — namely that a concert pianist was asking for a takedown of the critique, the bad critique, of his piano performance here in America. Yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or the individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty there; Paul is doing good work in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook — you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    The criteria are "no longer relevant" and "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is that through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making a decision that, at the insistence of the government, it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Do they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over. I’d really like you guys to give it its fullest consideration, there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google specific that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper in your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that makes the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power will be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it’s based on an inherent right to privacy, but how do you square that with at least the U.S., going to your constitutional point, that says, you know, with the Fourth Amendment search and seizure context, that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook, how do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if in the case of the Telegraph they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on, you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer for his private purposes design his own web engine, so this constellation would mean that you are not a controller, you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side, Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people, so perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete, is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high on your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law as anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic rule for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money on its searches; the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU commission, directorate general for justice and consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in the times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now, this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this was a little bit of a tough debate, in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about, you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and the ticket sales don't come close to covering the cost of mounting a great debate like this. So, I'd encourage you, if you would be willing to make a donation, to go to our website, that's IQ2US.org, and make a donation there. Our next debate is later this month. It's at Columbia University at the Miller Theater. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We're going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we're going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We've heard the arguments — sorry? Did I — sorry? There's chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online." In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We'll see you next time. [applause]
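
[Editor's note] The winner-by-movement rule John describes, and the percentages he reads out, can be checked with a few lines. This is only an illustrative sketch of the arithmetic, not anything used in the debate's actual scoring:

```python
# Sketch of the Intelligence Squared scoring rule: the side whose share of
# the audience vote moves the most between the pre- and post-debate polls
# is declared the winner (the undecided column is not scored directly).
def winner_by_movement(pre: dict, post: dict) -> str:
    deltas = {side: post[side] - pre[side] for side in ("for", "against")}
    return max(deltas, key=deltas.get)

# Figures read out in this debate: for 36% -> 35%, against 26% -> 56%.
pre = {"for": 36, "against": 26}
post = {"for": 35, "against": 56}
print(winner_by_movement(pre, post))  # prints "against" (+30 vs. -1)
```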

  • 00:01:04

    Let's meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that's what you're going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent Professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You've written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven't, but I'm hoping they'll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let's welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You're no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I'd probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There's really no place in the world that's been so affected by it. But I do think it's a travesty. John Donvan: Okay, we're making clear you're not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you're a professor of law and computer science at Harvard. You're a cofounder of the Berkman Center for Internet & Society. You looked at the EU's — Europe's "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you're saying — something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I've described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We're very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you'll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let's register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you're for the motion, number two if you're against, and number three if you're undecided. And we'll take about 15 seconds to complete that process, and then we'll lock out the voting devices. All right, folks, let's lock out those votes. And, again, we'll have the second vote right after you've heard the closing arguments. And then we'll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I'm not here on an official mission from the European Commission to turn America to this notion of law. It's more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don't tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the '70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can't ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that's what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don't have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn't fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn't apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask a school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact, in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it's in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn't mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what's happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to say: wipe out everything from before they were 18. In the Fair Credit Reporting Act, about your credit: information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the federal trade commissioner, Julie Brill, just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," are powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I'm going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn't even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that's the outline of the argument that I'm going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: the right to be forgotten. We call it a right, but really, it's a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty: being able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let's be very clear, that is what we're talking about here. We're talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we're talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it's way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court's decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It's very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul's claim that it's working kind of well, you can Google, while you still can, any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you'll find the case of the piano player who didn't like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it's conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that's not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we've seen that the internet adapts. We've seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I'll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what's going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I'm John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You've heard two of the opening arguments, and now on to the third. Let's welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I'd like to ask you to cast your minds back 25, 30 years. Those of you who are old enough will remember; for some of the younger people here, I'll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He's arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That's not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It's not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she's able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she's like and how she is a good mother. So, they know a little bit about what's going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other's infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they're private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called expungement statutes that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves: not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let's say it could be an employer or a creditor or a future romantic partner — it's really in the stranger's interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn't convicted, it's still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people's privacy." But, of course, this is what's happened. It's happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the '80s and '90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That's completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people's privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let's suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother's friends, you know, in the '90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would've written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That's how she's going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It's public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents' names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they've lost control over this information about themselves, and it's not just that they've lost this control; they've lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We'd vote for that motion, too. I think I'm comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

[laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

Let it do so in camera. That is to say in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

I’m like, "Bing, how many are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they don’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say deletion of information published by others, that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state with some trepidation to say, and you must never say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger, if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity or individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

My point is that this right is, in my judgment, a typical kind of EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curveball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody whom I could then demand that my data be removed from, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine — by profiling the individual, when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.

    [laughter]

    Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "right to be forgotten."

    Female Speaker: Right. So, to me it’s based on an inherent right to privacy, but how do you square that with the view, at least in the U.S. — going to your constitutional point, in the Fourth Amendment search-and-seizure context — that as for a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, for anybody interested in receiving communist literature — I’m not making this up — you have to register at your local post office; otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you. The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe apply to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.

    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?

    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying another way of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation — to make the decision, yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: all of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law, to weigh the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money by searching; the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor: this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the "for" folks were saying over the course of the evening — it’s shifting. It’s a standard that’s, "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it’s not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t have solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richmond Center for Business Law and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. On "The U.S. Should Adopt the ‘Right to be Forgotten’ Online," the first vote was 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist; she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

The question is about a right to be forgotten, and I want to explain in my opening statement why that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would not be felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

[laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the “for” side is rightfully bringing up.

  • 00:34:02

We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving is that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or if we allow individuals to have control over the data?

  • 00:44:58

I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes. So that’s factually not true.

  • 00:46:05

Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power or for the individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web with stuff out on it, and you are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, which at the insistence of the government it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made him- or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the U.S. is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story — which won’t be linked under the other guy’s name — push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific — it’s that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, that says, you know, in the Fourth Amendment search and seizure context, that a reasonable expectation of privacy — you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google, where, you know, there is only blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money from search — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine, in a world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t have solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in terms of it being relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that — but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business Law and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe adopted its "right to be forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten": the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more information about this person than any other source, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, the school, after a certain time, for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and for children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," are powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty imposed on others: the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul framed this as a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "Who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 or 30 years. Those of you who are old enough will remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy who is arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, and she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10 or 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the "right to be forgotten" would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain:
    What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration. There's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan:
    Well —

    Jonathan Zittrain:
    So I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.

    John Donvan:
    Jonathan, that is what's called "hitting a curveball out of the park."

    Jonathan Zittrain:
    Thank you.

    John Donvan:
    Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The US should adopt the 'right to be forgotten' online." Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker:
    Hi.

    John Donvan:
    If you can stand up and grab the mic.

    Male Speaker:
    Hi. I'm Adam. This — this law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan:
    Paul Nemitz, in Europe.

    Paul Nemitz:
    This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan:
    Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin:
    I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz:
    All search engines.

    Andrew McLaughlin:
    Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.

    Paul Nemitz:
    It's not Google-specific; it's that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan:
    Sir. Can you tell us your name, please?

    Male Speaker:
    Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan:
    Good question.

    Male Speaker:
    And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —

    John Donvan:
    Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain:
    It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin:
    I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional for one thing, but I also think it’s just fundamentally a terrible, terrib le idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased that, you kno w, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.

    John Donvan:
    [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —

    Female Speaker:
    [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan:
    Just so that the radio audience knows who you're talking about — the side arguing for the motion. They're arguing for the "right to be forgotten."

    Female Speaker:
    Right. So to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan:
    Eric Posner.

    Eric Posner:
    Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan:
    I'm going to — unless you really need to respond to that question, I'm going to move on.

    Female Speaker:
    For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan:
    Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.

    Jonathan Zittrain:
    I —

    John Donvan:
    Thank you for the question.

  • 01:06:56

    Jonathan Zittrain:
    — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it's exactly the kind of surveillance of who's wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan:
    Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan:
    When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan:
    Yeah, thanks.

    Male Speaker:
    This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would — you know, in the EU it sounds like it'd give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.

    John Donvan:
    Okay. So you're asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan:
    — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz:
    Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.

    John Donvan:
    So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz:
    Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan:
    Sir in the back near the wall. Thanks.

    Male Speaker:
    Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

    John Donvan:
    Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker:
    In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?

    John Donvan:
    In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker:
    Not exactly. I'm saying another way of press suppression —

    John Donvan:
    Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker:
    Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should the first thing they find be this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain:
    I wouldn't want Google to be uniquely privileged to answer that question.

    John Donvan:
    Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner:
    I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan:
    From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet ha s leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete is not the right answer. We need to have the internet evolve. I’ll like to see Google evolve, to add a more direc t right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan:
    Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we're going to delete or not, what are the checks on that from outside? How does that work?

    Paul Nemitz:
    Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency such as the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC.

    Male Speaker:
    But Paul —

    Paul Nemitz:
    This is an area which is subject to the law. It's not a discretionary decision of Google — that, you know, the blue sky, and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker:
    Yeah, Paul, I have to — [unintelligible].

    John Donvan:
    [unintelligible].

    Andrew McLaughlin:
    I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.

    John Donvan:
    But it does — it does live on. It does live on. It's not removed from the internet. It's just you can't find it through the criminal's name.

    Andrew McLaughlin:
    They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner:
    There are interests on both sides.

    John Donvan:
    Eric Posner.

    Eric Posner:
    So, that's the classic role for the law, to balance the interests on both sides —

    Andrew McLaughlin:
    How are they vindicated?

    Eric Posner:
    And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. Google makes money the better its searches are; the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain:
    — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner:
    No, but they don't want to set a precedent where they —

    Jonathan Zittrain:
    Google will pay no penalty for over-deleting.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    And no one will even know.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    Who will even know?

    Eric Posner:
    They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan:
    And that concludes round two of this Intelligence Squared US debate, where our motion is "The US should adopt the 'right to be forgotten' online."

  • 01:17:58

    [applause]

    John Donvan:
    Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the European Commission's Directorate-General for Justice and Consumers.

    Paul Nemitz:
    Ladies and gentlemen, I am defending the case that you should vote in favor of the motion: the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote: the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan:
    Thank you, Paul Nemitz.

    [applause]

    John Donvan:
    And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin:
    You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction. Speech on the internet is a conversation: people speaking, other people reading, and other people choosing to remember. What he is trying to push is the empowerment of an individual on one side of that conversation — a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of th e social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the de bate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of differentkinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This ca se, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image th at I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan:
    Thank you, Andrew McLaughlin.

    [applause]

    John Donvan:
    And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.

    Eric Posner:
    This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you. Perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan:
    Thank you, Eric Posner.

    [applause]

    John Donvan:
    The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain:
    I can't believe, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it's shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn't offer solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won't go to the second page; reputation systems; and, of course, competition —

    John Donvan:
    Jonathan Zittrain, I'm sorry —

    Jonathan Zittrain:
    — can I end with one thing —

    John Donvan:
    — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this w as a little bit of a tough debate in terms of it was relatively nuanced and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about. You just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan:
    And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain:
    Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.

    [applause]

    John Donvan:
    Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month at Columbia University, at the Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center, continuing the ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April, we’re going to be back here in this theater on the 15th, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second; they went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "right to be forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to win America over to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the words "right to be forgotten," the natural words, don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already in the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete your medical records, because the doctor must keep them, if only to show what he or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or, for that matter, if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of individuals to decide for themselves what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this doesn’t just fall from the sky; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, than any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank, the school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to court. Google has understood the European law of balance between privacy and free speech. The decision, you know, was very contested in the beginning; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California there is now a law which gives parents the possibility, and children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, you can ask for credit information which is older than seven years to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said, the way he framed it, that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this interest. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through the opening round of this Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside the area, hears about these events. And the law protected them. There were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves; not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic, a kooky academic like me or Jonathan, came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened in the last few years — he applies for a job, the employer puts his name into Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the "right to be forgotten" would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to eliminate the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant," in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The US Should Adopt the 'Right to Be Forgotten' Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The US Should Adopt the 'Right to Be Forgotten' Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent one. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, gees, you guys are gathering so much stuff, before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make a decision on a daily basis: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story — that won’t be linked under the other guy’s name — push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that weren’t now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over. I’d really like you guys to give it its fullest consideration, there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper in your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that makes the case more compelling? So, for example, you know, imagine if this happened in the US, the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power will be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it’s based on an inherent right to privacy, but how do you square that with at least the U.S., going to your constitutional point, that says, you know, with the Fourth Amendment search and seizure context, that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook, how do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed, they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if in the case of the Telegraph they had a list of suppressed articles but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature –" I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on, you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer for his private purposes design his own web engine, so this constellation would mean that you are not a controller, you would not be treated like Google and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side, Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal, but as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people, so perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal to imposing something like this? John Donvan: In a sense are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete, is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus where anybody could create a profile that shows up very high on your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law as anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC to go to court. Male Speaker: But Paul – Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money by searching well — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the EU commission, directorate general for justice and consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is a, in my judgment, false construction. The empowerment of an individual on one side of a conversation, which is what speech on the internet is, it’s people speaking, other people reading and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again but this time that it’s you, that perhaps you’re going through hard times in your marriage if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now, this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records like legal documents and court materials, that options for contextualization to include possibly even somebody like Google deciding, and when it’s a search on a name maybe the first result shouldn’t be those 10 terrible whatever comes out of the roulette wheel links but a curated page marked as such and then many won’t go to the second page, reputation systems and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this was a little bit of a tough debate in terms of it was relatively nuanced and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about. You just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": in the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the 'Right to Be Forgotten' Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he has done or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact, in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning were very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — is it more chilling for some entity to have that power, or an individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web, and stuff out on it, and you are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant" and "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down, all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay. I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, saves the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Do they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And, Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curveball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press, to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented. John Donvan: [unintelligible] take right down in front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search-and-seizure context, which says a reasonable expectation of privacy — you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law; but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations — but of course, within reason, national security limitations and so on. You know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying it’s another avenue of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should the first thing they find be this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. Google makes money by searching well — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children; or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what was it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": in the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side — that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and for children when they become 18 — to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google, while you still can, any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends — you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have I assume some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say in secret so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascistic dictators or communist dictators, they don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state with some trepidation to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. What chilling effect is bigger, if we are able to know everything about you and even in a way which you wouldn’t even think of or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten when you have a private company making the decision at the insistence of the government it must keep it secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion and, for example, you know, newspapers they have to make on a daily basis a decision do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, I need to just — in the interest of time I need to interrupt you for that, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of like EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish premier league referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you, should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way in your view to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches [spelled phonetically] instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific — it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power will be delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —

    [laughter]

    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten."

    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU, it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law. But of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me," in Europe, applies to the state and also to corporations — but of course within reasonable national security limitations and so on. You know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal to imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying another way of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law as anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly — and then a judge will decide — or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law, so that the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in the world in which we have search engines competing with each other.

    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared U.S. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of individuals to decide for themselves what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment: first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, more than any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank, or the school for things to be deleted after a certain time, they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and there was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go on to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, you can ask for credit information which is older than seven years to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles (the ones that I found were in the English-language press in the UK) outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, and she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they were private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s would have gossiped, where today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, "Bing, how many are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want any privacy for you. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say deletion of information published by others, that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in this story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very "nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. [laughter] Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria we can set. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about: the side arguing for the motion — they’re arguing for the "right to be forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search and seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. This constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying it’s another avenue of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled is that people develop over time. You know, the one thing that you do as a 16- or 17-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — and I don’t mean that to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth estate. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, balancing the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. Google makes money the better its searches are; the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified only where it overcomes a very high bar in order to stand as an exception to the right of free speech. The affirmative case has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening keeps shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that’s not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April, we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates. And I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of the people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent — that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe established its "Right to Be Forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain — you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s — "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data — you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion — of course, within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist — or, for that matter, if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else — any other source of information — about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, the school, after a certain time, for things to be deleted, they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and there was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state — because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish — but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." A last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and children when they become 18 — to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, you can ask for credit information which is older than seven years to be deleted.

  • 00:13:02

    So it is already there in part. And the Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America — a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion — the emotional rationale that lies behind the "right to be forgotten" — are powerful. And understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas — all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information — not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut (W-U-U-T), which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. Those of you who are old enough will remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

The question is about a right to be forgotten, and I want to explain in my opening statement why that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

[laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, “no longer relevant in the view of the complainant,” information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is, the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want anybody to have any privacy. They want to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power, or for the individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government that it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, there are very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

My point is that this right is, in my judgment, a typical kind of EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

So, for example, there is a story — I’m just taking one at random here, where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain:
What are you going to do? But the idea that the recourse should take the form of going in front of Google and — in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

John Donvan:
Well —

Jonathan Zittrain:
So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.

John Donvan:
Jonathan, that is what's called "hitting a curve ball out of the park."

Jonathan Zittrain:
Thank you.

John Donvan:
Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

Male Speaker:
Hi.

John Donvan:
If you can stand up and grab the mic.

Male Speaker:
Hi. I'm Adam. This law seems to be very specifically tailored to Google. And I'm wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody whom I could then demand that my data be removed from, or not?

John Donvan:
Paul Nemitz, in Europe.

Paul Nemitz:
This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan:
Andrew, CEO of Digg, how do you feel about that?

Andrew McLaughlin:
I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

Paul Nemitz:
All search engines.

Andrew McLaughlin:
Paul frames this as just a "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents, in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.

Paul Nemitz:
It's not Google-specific; it's just that the case was about Google.

  • 01:00:57

But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

John Donvan:
Sir. Can you tell us your name, please?

Male Speaker:
Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

John Donvan:
Good question.

Male Speaker:
And then if, you know, Google wanted to deny a request, or didn't want to deal with the process, you could have some special advocate appointed to —

John Donvan:
Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

Jonathan Zittrain:
It is curious that you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And — I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.

Andrew McLaughlin:
I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.

John Donvan:
Eric, did you want to respond to that?

Eric Posner:
It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states that allow people's criminal records to be erased — mean that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I are very close together, meaning that he's on our side.

[laughter]

Eric Posner:
For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

So, I do think there are administrative criteria we could set. Now, often it's very difficult to set them up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.

John Donvan:
[unintelligible] take right down in the front in the third row, please. And the mic's coming down on your right-hand side. And if you could stand up —

Female Speaker:
[unintelligible] as a follow-up question to yours. In order for your side to win —

John Donvan:
Just so that the radio audience knows who you're talking about — the side arguing for the motion. They're arguing for the "right to be forgotten."

Female Speaker:
Right. So to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan:
Eric Posner.

Eric Posner:
Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yes, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.

John Donvan:
I'm going to — unless you really need to respond to that question — I'm going to move on.

Female Speaker:
For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

John Donvan:
Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.

Jonathan Zittrain:
I —

John Donvan:
Thank you for the question.

  • 01:06:56

Jonathan Zittrain:
— I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found, and would find, that unconstitutional, because it's exactly the kind of surveillance of who's wanting to peek under the envelope that all sides appear to be concerned about.

John Donvan:
Another question? Right in the center there. Adamant waving actually does work with me.

[laughter]

John Donvan:
When I see everybody going nuts. Can you tell us your name, please? Again —

[talking simultaneously]

John Donvan:
Yeah, thanks.

Male Speaker:
This question is for the "for" side. I'm a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, it sounds like, would give me an advantage in the EU over other people in terms of the information that I have access to. I'm curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.

John Donvan:
Okay. So you're asking, if you were smart enough — which it sounds like you probably are —

[laughter]

John Donvan:
— to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

Paul Nemitz:
Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law; but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion, and the right to ask what you have about me, applies in Europe to the state and also to corporations — though of course within reason: national security limitations and so on. You know, there we are not so far apart.

John Donvan:
So, how far are you getting in asking the NSA to delete information?

[laughter]

Paul Nemitz:
Well, the answer is, he described a case where he would, as a developer, design his own web engine for his private purposes. This constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan:
Sir in the back near the wall. Thanks.

Male Speaker:
Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

John Donvan:
Can you rephrase what your question specifically is in terms of the motion here?

Male Speaker:
In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is, you know, an alarming signal about imposing something like this?

John Donvan:
In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —

Male Speaker:
Not exactly. I'm saying another way of press suppression —

John Donvan:
Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

Male Speaker:
Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should the first thing they find be this racist chant they did, or should they not find it at all?

[applause]

Jonathan Zittrain:
I wouldn't want Google to be uniquely privileged to answer that question.

John Donvan:
Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner:
I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook, and 30 years from now that came up, I would be troubled by that. And the reason for being troubled is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

John Donvan:
From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They're only looking at what comes up on the first page. Andrew McLaughlin.

Andrew McLaughlin:
Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply, and I think a lot about it. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody's life. And so my view on this is, still, censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.

John Donvan:
Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation — to make the decision, yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?

Paul Nemitz:
Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, in Europe as in America against a decision of the FTC, to go to court.

Male Speaker:
But Paul —

Paul Nemitz:
This is an area which is subject to the law. It's not a discretionary decision of Google, where, you know, there's the blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think this is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

Male Speaker:
Yeah, Paul, I have to — [unintelligible].

John Donvan:
[unintelligible].

Andrew McLaughlin:
I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.

John Donvan:
But it does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.

Andrew McLaughlin:
They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that referee's name to be associated with it.

  • 01:17:01

They have an interest in that.

Eric Posner:
There are interests on both sides.

John Donvan:
Eric Posner.

Eric Posner:
So, that's the classic role for the law: to balance the interests on both sides —

Andrew McLaughlin:
How are they vindicated?

Eric Posner:
And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. Google makes money from search — the better its searches are, the more money it makes. This is why —

[talking simultaneously]

Jonathan Zittrain:
— talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

Eric Posner:
No, but they don't want to set a precedent where they —

Jonathan Zittrain:
Google will pay no penalty for over-deleting.

Eric Posner:
They will.

Jonathan Zittrain:
And no one will even know.

Eric Posner:
They will.

Jonathan Zittrain:
Who will even know?

Eric Posner:
They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

John Donvan:
And that concludes round two of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:17:58

[applause]

John Donvan:
Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

Paul Nemitz:
Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes possible total prediction, total collection of anything you do, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.

John Donvan:
Thank you, Paul Nemitz.

[applause]

John Donvan:
And that's our motion: "The U.S. Should Adopt a 'Right to Be Forgotten' Online." And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

Andrew McLaughlin:
You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. And that's what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case — the affirmative case — has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

John Donvan:
Thank you, Andrew McLaughlin.

[applause]

John Donvan:
And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.

Eric Posner:
This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it's you — that perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children; or maybe a statement that you made about your own children, one you wouldn't want them to hear, is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

John Donvan:
Thank you, Eric Posner.

[applause]

John Donvan:
The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.

Jonathan Zittrain:
I can't believe that in the last slot I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

[laughter]

This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure there is transparency not on specific cases, only on the overall criteria, and that seems very, very dangerous to me. Now, John said we didn't offer solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won't go to the second page; reputation systems; and, of course, competition —

John Donvan:
Jonathan Zittrain, I'm sorry —

Jonathan Zittrain:
— can I end with one thing —

John Donvan:
— your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

[laughter]

[applause]

And that concludes our closing statements.

  • 01:27:04

And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]

John Donvan:
And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

[talking simultaneously]

Jonathan Zittrain:
Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.

[applause]

John Donvan:
Well, I'm sorry — the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online. In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a co-founder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy, he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, “no longer relevant,” in the view of the complainant, information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the “right to be forgotten” online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the “right to be forgotten” online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the “right to be forgotten,” is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is “censorship,” that the kind of right that we’re talking about, where an individual can go to Google and say, “When my name comes up, I don’t want this link to show up or this link to show up or this link to show up,” amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU “right to be forgotten” exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say “deletion of information published by others,” that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term “tort action”? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word “censorship.” I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must never say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a “right to be forgotten” implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word “censorship” is not helpful here. What the “Right to Be Forgotten” does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, “Look, there’s just nothing much to see here,” like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the “Right to Be Forgotten” doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, “I’m a Democrat,” “I’m a Republican.” Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, “You can have this or you can have every single thing you do in the world be Google-able forever,” I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are “no longer relevant,” “inadequate” information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) Whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google “right to be forgotten horror stories.” That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European “right to be forgotten.” Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the “right to be forgotten” infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented. John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "right to be forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question — I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask, what do you have about me, applies in Europe to the state and also to corporations — but of course, within reason: national security limitations and so on. You know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal against imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth estate. It is good that the press is after Google and says: be more transparent, tell us what you’re doing, tell us what the criteria are. That is good, and we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are toward deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law — to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money the better its searches are — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. Because that is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each is justified by overcoming a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So when you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure there is review not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions, and in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to be Forgotten' Online." In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared U.S. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "right to be forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because — you’re saying — something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data — you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion — of course, within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side — that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case, from Spain, had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and for children when they become 18 — to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion — the emotional rationale that lies behind the "right to be forgotten" — is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right — "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches on his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut — W-U-U-T — which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or, for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, "here’s what was deleted today," it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re looking at the box and it’s ticking, and I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and — in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: the Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power would be delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —
    [laughter]
    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten": only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion; they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, where a reasonable expectation of privacy — you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations — but of course, within reason, national security limitations and so on. You know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying it’s another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no — we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, where, you know, it’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money by searching — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine, in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the European Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children — we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. That perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the "for" folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t owe solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "right to be forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because — you’re saying — something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already in the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he or she has done or in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that people can ask data brokers, or the bank, or the school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper, in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California there is now a law which gives the possibility for parents — and for children when they become 18 — to say: wipe out everything from before they were 18. And under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: the "right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty, the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. It works by suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. Those of you who are old enough will remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy who is arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There were privacy laws — they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too — I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter]

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. What chilling effect is bigger, if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers, they have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results and — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called “hitting a curve ball out of the park.” Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the “right to be forgotten” online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody of whom I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the “right to be forgotten.” Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very “nothing to see here” kind of incremental extension of past principles. But it is a radical departure from prior precedents, in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the “no” side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the “right to be forgotten,” in ways that make the case more compelling? So, for example, you know, imagine this happened in the US, and Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a “right to be forgotten” workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you know, for example, I would completely rule out public figures from using the “right to be forgotten” — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set these up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the “right to be forgotten” were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the “right to be forgotten.” Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, “You are being traced,” just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, “For anybody interested in receiving communist literature” — I’m not making this up — “you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you.” The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the “for” side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money by searching well: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine, in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the “right to be forgotten” online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion, the US should adopt a “right to be forgotten” — not necessarily the one of the EU, but a “right to be forgotten.” Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the “right to be forgotten” online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a “right to be forgotten” online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. It is not the empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is “The U.S. Should Adopt the Right to Be Forgotten Online.” And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, “He who controls the past controls the future.” Orwell’s invocation was inevitable. But I want to ask you, though: who is the “he” in that statement? Is the “he” you, is the “he” me, or is the “he” Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is “The U.S. Should Adopt the Right to Be Forgotten Online.” And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — “Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out.” But then I come to the question Eric just asked: who is “we”? And in the proposal, “we” is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, “When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?” Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, “The U.S. Should Adopt the Right to Be Forgotten Online.” If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to Be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a co-founder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s — "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten": the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty, the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said, the way he framed it was, that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist; she misses some credit card bills; she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state with some trepidation to say, and you must never say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger, if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, which at the insistence of the government it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that is now not the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US, and Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to deny a request or didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.
    [laughter]
    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria we can set. Now, it’s often very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion, and the right to ask what you have about me, in Europe apply to the state and also to corporations — but of course, within reason, with national security limitations and so on. You know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about it. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that referee’s name to be associated with it.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law — to weigh the interests on both sides.
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money by searching well — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the European Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, the first thing that people learn about you when they Google your name. So when you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard of — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure transparency applies not to any specific case, only to the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates. And I want to get to the results now, because it is all in. We have the final results. Once again, the motion was: The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on The U.S. Should Adopt the "Right to be Forgotten" Online. In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion took 35 percent; that means they lost one percentage point. The team arguing against the motion had a first vote of 26 percent and a second vote of 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]
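The winner-by-swing rule John Donvan describes can be sketched in a few lines of code. This is purely a hypothetical illustration of the arithmetic (not anything Intelligence Squared actually runs); the percentages are the ones announced in the debate:

```python
# Intelligence Squared declares the winner to be the side whose share of the
# audience vote grows the most between the pre- and post-debate polls.

def vote_swing(before: int, after: int) -> int:
    """Percentage-point change for one side between the two votes."""
    return after - before

# First vote:  36% for, 26% against, 38% undecided.
# Second vote: 35% for, 56% against.
for_swing = vote_swing(36, 35)        # the side arguing for lost 1 point
against_swing = vote_swing(26, 56)    # the side arguing against gained 30 points

winner = "against" if against_swing > for_swing else "for"
print(for_swing, against_swing, winner)  # -1 30 against
```

Note that 56 minus 26 is a 30-point gain, so the side arguing against the motion wins on swing even though the "for" side led the first vote.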


  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship directorate in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of Human Rights Law." And back in the U.S., when Europe adopted its "Right to Be Forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice, or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side — that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else — than any other source of information about this person — and our law requires that, in the same way that our people can ask data brokers, the bank, or the school for things to be deleted after a certain time, they should be able to ask Google for deletion. Now, the person in question, who brought this case from Spain, had not paid some social contributions, and therefore his house had been confiscated. And it was a legal obligation in Spain to publish this fact in a newspaper, in order to make the auctioning of the house to cover the social contributions more attractive for the Spanish state — because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish — but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which go on to the courts. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age — in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives parents — and children when they become 18 — the possibility to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion — the emotional rationale that lies behind the "right to be forgotten" — are powerful. And they are understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas — all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is the ability to force other people to forget what they would otherwise remember. That ability — that right to remember — is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

The "right to be forgotten" intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut (W-U-U-T), which are applications that are designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or, for some of the younger people here, I’ll just have to tell you what it was like then — we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they were private. Not everybody in the world, or even outside this area, hears about these events. And the law protected them. There were privacy laws — they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important — but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, "here’s what was deleted today," it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us, along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question and I don’t feel you’re answering, which was their charge that this is censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, when we say "deletion of information published by others," that is deletion, literally, of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving is that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is raise the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here." Like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is pretending that everybody who makes use of this right is someone about whom there is some public interest in knowing what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who has stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America: yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is, what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes. So that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this (please, Bing, do it too; please, people we haven’t heard of, do it too), the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable: that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, which at the insistence of the government it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make, on a daily basis, a decision: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, there are very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made the decision him- or herself to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down: all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being brought by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: Well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here, where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. Which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, saves the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Do they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in a story, which won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
John Donvan: Well —
Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
John Donvan: Jonathan, that is what’s called "hitting a curveball out of the park."
Jonathan Zittrain: Thank you.
John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared U.S. debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The U.S. should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
Male Speaker: Hi.
John Donvan: If you can stand up and grab the mic.
Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?
John Donvan: Paul Nemitz, in Europe.
Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that?
Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
Paul Nemitz: All search engines.
Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea.
Paul Nemitz: It’s not Google-specific — it’s just that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
John Donvan: Sir. Can you tell us your name, please?
Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the U.S.: the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
John Donvan: Good question.
Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the U.S. to the point where you would find it acceptable?
Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
John Donvan: Eric, did you want to respond to that?
Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.
[laughter]
Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion — they’re arguing for the "right to be forgotten."
Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, where with a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner.
Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
Jonathan Zittrain: I —
John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
[laughter]
John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
[talking simultaneously]
John Donvan: Yeah, thanks.
Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —
[laughter]
John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
John Donvan: So, how far are you getting in asking the NSA to delete information?
[laughter]
Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks.
Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal against imposing something like this?
John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —
Male Speaker: Not exactly. I’m saying another way of press suppression —
John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
[applause]
Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin.
Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
Male Speaker: But Paul —
Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
Male Speaker: Yeah, Paul, I have to — [unintelligible].
John Donvan: [unintelligible].
Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that.
Eric Posner: There are interests on both sides.
John Donvan: Eric Posner.
Eric Posner: So, that’s the classic role for the law, so that the interests on both sides —
Andrew McLaughlin: How are they vindicated?
Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —
[talking simultaneously]
Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
Eric Posner: No, but they don’t want to set a precedent where they —
Jonathan Zittrain: Google will pay no penalty for over-deleting.
Eric Posner: They will.
Jonathan Zittrain: And no one will even know.
Eric Posner: They will.
Jonathan Zittrain: Who will even know?
Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
John Donvan: And that concludes — that concludes round two of this Intelligence Squared U.S. debate, where our motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:17:58

[applause]
John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is a director at the European Commission, Directorate-General for Justice and Consumers.
Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the U.S. should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the U.S. should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.
John Donvan: Thank you, Paul Nemitz.
[applause]
John Donvan: And that’s our motion, "The U.S. Should Adopt a 'Right to Be Forgotten' Online." And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
John Donvan: Thank you, Andrew McLaughlin.
[applause]
John Donvan: And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
John Donvan: Thank you, Eric Posner.
[applause]
John Donvan: The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
[laughter]
This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page, marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —
John Donvan: Jonathan Zittrain, I’m sorry —
Jonathan Zittrain: — can I end with one thing —
John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
[laughter]
[applause]
And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]
John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
[talking simultaneously]
Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
[applause]
John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion had a first vote of 26 percent and a second vote of 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]
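    The winner-by-delta rule John Donvan describes can be sketched in a few lines. This is an illustrative reconstruction, not Intelligence Squared’s actual tallying code, and the second vote’s undecided share (9 percent) is inferred as the remainder of 100 minus the stated figures, which is an assumption.

```python
# Illustrative sketch of the debate's winner-by-delta rule: the side whose
# share of the audience moves the most between the two votes wins.
# Not IQ2's actual tallying code; the second-vote "undecided" share is
# inferred as the remainder (100 - 35 - 56 = 9) and is an assumption.

def winner_by_delta(first_vote, second_vote):
    """Return (side, gain) for the side with the largest gain, in
    percentage points, between the first and second audience votes."""
    deltas = {side: second_vote[side] - first_vote[side] for side in first_vote}
    best = max(deltas, key=deltas.get)
    return best, deltas[best]

first = {"for": 36, "against": 26, "undecided": 38}
second = {"for": 35, "against": 56, "undecided": 9}

print(winner_by_delta(first, second))  # → ('against', 30)
```

    On these numbers the side arguing for the motion moved minus 1 point and the side against moved plus 30, which matches the result announced above.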

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared U.S. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve most recently written the book "The Twilight of Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s — "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, more than any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part. And Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America: a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. The "right to be forgotten" intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?
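    The "right to respond" alternative described above, annotating a search result rather than deleting it, can be sketched as a minimal data model. Everything in this sketch (the class, field names, and example URL) is hypothetical and invented purely for illustration; no real search engine exposes such an API.

```python
# Hypothetical sketch of the "right to respond" idea: the subject of a
# search attaches a reply that would be displayed alongside a result,
# instead of having the result suppressed. All names here are invented
# for illustration; this is not a real search-engine API.

from dataclasses import dataclass, field

@dataclass
class SearchResult:
    url: str
    title: str
    responses: list = field(default_factory=list)  # replies added by the subject

    def add_response(self, author: str, text: str) -> None:
        # Annotate rather than suppress: the link stays in the index.
        self.responses.append({"author": author, "text": text})

result = SearchResult("https://example.com/2010-review", "Unflattering concert review")
result.add_response("pianist", "Context: this covered a single 2010 performance.")
print(len(result.responses))  # → 1
```

    The design point is the one Andrew makes: the record of true information is preserved, and the subject's side of the story travels with it.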

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not-so-smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests. It’s like, time to make the donuts; you’re like, yes, no, no. And Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to eliminate the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, "Bing, how many of these are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, they say, is real, but the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anyone to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here." Like, this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking, and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening, tomorrow you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes. So that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power, or an individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too, the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over. I’d really like you guys to give it its fullest consideration, there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google specific that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper in your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that makes the case more compelling? So, for example, you know, we imagine if this happened in the US, the Congress passed a statute and say something like, something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power will be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, that says, you know, with the Fourth Amendment search and seizure context, that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if in the case of the Telegraph they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as delete — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on, you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people, so perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal to imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus where anybody could create a profile that shows up very high on your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law as anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about. You just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the 'Right to Be Forgotten' Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he has done or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station, all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision, you know, was in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must never say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger, if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, gees, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, which at the insistence of the government it must keep secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of like EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote, what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish premier league referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan:

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in a story that won’t be linked under the other guy’s name push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Faulkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way in your view to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that is now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain:
    What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan:
    Well —

    Jonathan Zittrain:
    So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan:
    Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain:
    Thank you.

    John Donvan:
    Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker:
    Hi.

    John Donvan:
    If you can stand up and grab the mic.

    Male Speaker:
    Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan:
    Paul Nemitz, in Europe.

    Paul Nemitz:
    This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan:
    Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin:
    I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz:
    All search engines.

    Andrew McLaughlin:
    Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz:
    It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan:
    Sir. Can you tell us your name, please?

    Male Speaker:
    Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute that said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan:
    Good question.

    Male Speaker:
    And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan:
    Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?

    Jonathan Zittrain:
    It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin:
    I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan:
    Eric, did you want to respond to that?

    Eric Posner:
    It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about — how strict the criteria would be — and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.

    [laughter]

    Eric Posner:
    You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan:
    [unintelligible] take right down in the front, in the third row, please. And the mic’s coming down on your right-hand side. And if you could stand up —

    Female Speaker:
    [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan:
    Just so that the radio audience knows who you’re talking about — the side arguing for the motion. They’re arguing for the "right to be forgotten."

    Female Speaker:
    Right. So, to me, it’s based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that as for a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan:
    Eric Posner.

    Eric Posner:
    Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan:
    I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker:
    For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —

    John Donvan:
    Okay, I think we see where you’re going with that. Let’s let Jonathan answer that.

    Jonathan Zittrain:
    I —

    John Donvan:
    Thank you for the question.

  • 01:06:56

    Jonathan Zittrain:
    — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found, and would find, that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan:
    Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan:
    When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan:
    Yeah, thanks.

    Male Speaker:
    This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan:
    Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan:
    — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz:
    Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion, and the right to ask "what do you have about me," in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan:
    So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz:
    Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan:
    Sir in the back near the wall. Thanks.

    Male Speaker:
    Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

    John Donvan:
    Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker:
    In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan:
    In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker:
    Not exactly. I’m saying another way of press suppression —

    John Donvan:
    Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, in the very far back. Thanks for your question, though.

    Male Speaker:
    Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain:
    I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan:
    Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner:
    I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan:
    From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin:
    Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan:
    Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?

    Paul Nemitz:
    Well, Google, as a private corporation, is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC.

    Male Speaker:
    But, Paul —

    Paul Nemitz:
    This is an area which is subject to the law. It’s not a discretionary decision of Google — that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker:
    Yeah, Paul, I have to — [unintelligible].

    John Donvan:
    [unintelligible].

    Andrew McLaughlin:
    I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan:
    But it does — it does live on. It does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.

    Andrew McLaughlin:
    They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner:
    There are interests on both sides.

    John Donvan:
    Eric Posner.

    Eric Posner:
    So, that’s the classic role for the law — to balance the interests on both sides —

    Andrew McLaughlin:
    How are they vindicated?

    Eric Posner:
    And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money by searching — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain:
    — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner:
    No, but they don’t want to set a precedent where they —

    Jonathan Zittrain:
    Google will pay no penalty for over-deleting.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    And no one will even know.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    Who will even know?

    Eric Posner:
    They will — they will know, because it will do worse than an alternative search engine, in a world in which we have search engines competing with each other.

    John Donvan:
    And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan:
    Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz:
    Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan:
    Thank you, Paul Nemitz.

    [applause]

    John Donvan:
    And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin:
    You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan:
    Thank you, Andrew McLaughlin.

    [applause]

    John Donvan:
    And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner:
    This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan:
    Thank you, Eric Posner.

    [applause]

    John Donvan:
    The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain:
    I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that’s not on a specific case, only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn’t have solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page, marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan:
    Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain:
    — can I end with one thing —

    John Donvan:
    — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan:
    And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what was it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain:
    Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan:
    Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richmond Center for Business Law and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is, I think, one of the most fundamental rights of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can, while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, "here’s what was deleted today," it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can — back to you guys — I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information to be taken out when it is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anybody to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, when we say "deletion of information published by others," that is deletion, literally, of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, and tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks — to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person — become a politician — journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here" — like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, from the Washington Post — namely, that a concert pianist was asking for a takedown of a bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes. So that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them. Because if that’s the choice I have to make — if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever" — I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information" — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is that through repeated decision-making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes, or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make, on a daily basis, a decision: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made the decision himself or herself to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected — the politicians, the business people — to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the U.S. is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about — they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay. I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story — that won’t be linked under the other guy’s name — push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —

    [laughter]

    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "right to be forgotten."

    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, where, as for a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me, in Europe, applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir, in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe — than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying another way of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir, in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe like in America, to go to court against the decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law, so that the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money from searching — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, directorate general for justice and consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. From me, John Donvan, and Intelligence Squared U.S., thank you, and we’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered — by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe established its "right to be forgotten" a little while back, most American academics were quite skeptical and even outraged, but you said that you actually thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain — you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data — you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion — of course, within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society — because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else — any other source of information — about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state — because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish — but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the courts. Google has understood the European law of balance between privacy and free speech. The decisions were, in the beginning, very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age — in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California there is now a law which gives parents — and children, when they become 18 — the possibility to wipe out everything until they were 18. Under the Fair Credit Reporting Act, you can ask for information about your credit which is older than seven years to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America: a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion — the emotional rationale that lies behind the "right to be forgotten" — are powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas — all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: the right to be forgotten. We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. And let’s be very clear about what we’re talking about here. We’re talking about the suppression of true information — not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by removing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T — applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they were private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important — but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control — they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is: are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were: have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were: should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. There is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results: here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
John Donvan: Well —
Jonathan Zittrain: So I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.
John Donvan: Jonathan, that is what's called "hitting a curveball out of the park."
Jonathan Zittrain: Thank you.
John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
Male Speaker: Hi.
John Donvan: If you can stand up and grab the mic.
Male Speaker: Hi. I'm Adam. This law seems to be very specifically tailored to Google. And I'm wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
John Donvan: Paul Nemitz, in Europe.
Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that?
Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
Paul Nemitz: All search engines.
Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.
Paul Nemitz: It's not Google-specific just because the case was about Google.

  • 01:00:57

But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, it brings up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than the one website on which you have your stuff.
John Donvan: Sir. Can you tell us your name, please?
Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
John Donvan: Good question.
Male Speaker: And then if, you know, Google wanted to deny a request, or didn't want to deal with the process, you could have some special advocate appointed to —
John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.
Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.
John Donvan: Eric, did you want to respond to that?
Eric Posner: It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased — the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I are very close together, meaning that he's on our side.
[laughter]
Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

So I do think there are administrative criteria we can set. Now, often it's very difficult to set them up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.
John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —
Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion — they're arguing for the "right to be forgotten."
Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with the Fourth Amendment — going to your constitutional point — the search-and-seizure context, where you lose a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner.
Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.
John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on.
Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.
Jonathan Zittrain: I —
John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it's exactly the kind of surveillance of who's wanting to peek under the envelope that all sides appear to be concerned about.
John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
[laughter]
John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
[talking simultaneously]
John Donvan: Yeah, thanks.
Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know — in the EU it sounds like it'd give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.
John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are —
[laughter]
John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So the right to deletion, and the right to ask "what do you have about me?", applies in Europe to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
John Donvan: So how far are you getting in asking the NSA to delete information?
[laughter]
Paul Nemitz: Well, the answer is, he described a case where he would, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks.
Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.
John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps an alarming signal about imposing something like this?
John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —
Male Speaker: Not exactly. I'm saying another way of press suppression —
John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should the first thing they find be this racist chant they did, or should they not find it at all?
[applause]
Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.
John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They're only looking at what comes up on the first page. Andrew McLaughlin.
Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life is, I think, heartbreaking. And it's something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody's life. And so my view on this is, still, that censorship — which I don't mean to be a kind of nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?
Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, as in Europe and in America, to go to court against the decision of that authority.
Male Speaker: But Paul —
Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google — as if, you know, there's the blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth estate. It is good that the press is after Google and says, be more transparent, tell us what you're doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
Male Speaker: Yeah, Paul, I have to — [unintelligible].
John Donvan: [unintelligible].
Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: all of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. Nevertheless, in many of these stories — I mean, on and on through this list that I've got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.
John Donvan: But it does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.
Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with it.

  • 01:17:01

They have an interest in that.
Eric Posner: There are interests on both sides.
John Donvan: Eric Posner.
Eric Posner: So that's the classic role for the law, so that the interests on both sides —
Andrew McLaughlin: How are they vindicated?
Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: the profit motive. The better Google's searches are, the more money it makes. This is why —
[talking simultaneously]
Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
Eric Posner: No, but they don't want to set a precedent where they —
Jonathan Zittrain: Google will pay no penalty for over-deleting.
Eric Posner: They will.
Jonathan Zittrain: And no one will even know.
Eric Posner: They will.
Jonathan Zittrain: Who will even know?
Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause]
John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.
John Donvan: Thank you, Paul Nemitz.
[applause]
John Donvan: And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction. It is not the empowerment of an individual on one side of a conversation — which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But such censorship is justified only when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
John Donvan: Thank you, Andrew McLaughlin.
[applause]
John Donvan: And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.
Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you. Perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So imagine it's you, and your spouse says — falsely, maybe — that you neglected the children; or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
John Donvan: Thank you, Eric Posner.
[applause]
John Donvan: The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
Jonathan Zittrain: I can't believe that in the last slot I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
[laughter]
This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff comes down." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure transparency applies not to any specific case, only to the overall criteria — and that seems very, very dangerous to me. Now, John said we offered no solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, including possibly even somebody like Google deciding that when it's a search on a name, maybe the first result shouldn't be those 10 terrible links — whatever comes out of the roulette wheel — but a curated page, marked as such, and then many won't go to the second page; at reputation systems; and, of course, competition —
John Donvan: Jonathan Zittrain, I'm sorry —
Jonathan Zittrain: — can I end with one thing —
John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
[laughter]
[applause]
And that concludes our closing statements.

  • 01:27:04

And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you're undecided, push number three. We'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]
John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
[talking simultaneously]
Jonathan Zittrain: Here was the landing I was hoping to make: we all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.
[applause]
John Donvan: Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion had a first vote of 26 percent and a second vote of 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of Human Rights Law." And back in the U.S., when Europe got its "Right to Be Forgotten" ruling a little while back, most American academics were quite skeptical and even outraged, and you said that you actually thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, about you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, the school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and there was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the courts. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California there is now a law which gives parents the possibility, and children when they become 18, to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, you can ask for credit information which is older than seven years to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 or 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, misses some credit card bills, and files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10 or 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, "Are there problems with privacy?" Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, "Have problems of privacy gotten more difficult over time?" We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, "Should everything be recorded at all times, and if so, should we do something about that?" I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascistic dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger, if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do on the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision at the insistence of the government it must keep its secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers, they have to make on a daily basis a decision do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, I need to — just in the interest of time I need to interrupt you for that, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific just because the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.
    [laughter]
    Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are — [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir, in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal against imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir, in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, to weigh the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money from search — the better its searches are, the more money it makes. This is why — [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz. [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation — which is what speech on the internet is. It’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But such censorship is justified only because it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin. [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children, one that you wouldn’t want them to hear, is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner. [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that transparency is not on a specific case, only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online: on the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent on the first vote to 56 percent on the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government always already knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, more than any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and there was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. In the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty, the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say, the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers, they have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examp les, the guy who did the professional [unintelligible] the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, I need to just in the interest of time I need to interrupt you for that, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was justmaking. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy and Paul is actually saying it’s the ordinary guy who’s getting the take downs and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story, I’m just taking one at random here, where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called “hitting a curve ball out of the park.” Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the “right to be forgotten” online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the “right to be forgotten.” Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very “nothing to see here” kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the “no” side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the “right to be forgotten” in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a “right to be forgotten” workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about — how strict the criteria would be — and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you know, for example, I would completely rule out public figures from using the “right to be forgotten” — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the “right to be forgotten” were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the “right to be forgotten.” Female Speaker: Right. So, to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that as to a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question — I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, “You are being traced,” just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, “For anybody interested in receiving communist literature” — I’m not making this up — “you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you.” The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the “for” side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the “yes” side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I’m saying it’s another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should the first thing they find be this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility — as in America against a decision of the FTC — to go to court against that decision. Male Speaker: But Paul — Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, it’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. Google makes money from search: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the “right to be forgotten” online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a “right to be forgotten” — not necessarily the one of the EU, but a “right to be forgotten.” Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the “right to be forgotten” online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a “right to be forgotten” online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. It empowers an individual on one side of a conversation — which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So when you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, “He who controls the past controls the future.” Orwell’s invocation was inevitable. But I want to ask you, though: who is the “he” in that statement? Is the “he” you, is the “he” me, or is the “he” Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard of, “Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out.” But then I come to the question Eric just asked: who is “we”? And in the proposal, “we” is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, “When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?” Paul has been working, I think, to make sure transparency happens not on a specific case, but only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t have solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know — they can ask the school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and children when they become 18 — to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have I assume some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment when it is granted Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say in secret so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, "Bing, how many of these are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state with some trepidation to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger, if we are able to know everything about you and even in a way which you wouldn’t even think of or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there’s plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten when you have a private company making the decision at the insistence of the government it must keep its secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion and, for example, you know, newspapers have to make on a daily basis a decision do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here, where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Faulkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way in your view to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that is now not the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I’d really like you guys to give it its fullest consideration. There’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody whom I could then demand that my data be removed from, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific just because the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there mechanisms to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: the Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations, and the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. [laughter]
    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think there are administrative criteria we can set. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search-and-seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law. But of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe apply to the state and also to corporations — but of course, within reasonable national security limitations and so on. You know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]
    Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, that censorship — which I don’t mean to be kind of a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And, by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody can create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority, which is something like an independent agency, like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money from its searches; the better its searches are, the more money it makes. This is why — [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination. And it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz. [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation. Speech on the internet is people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But in each case it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin. [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children; or maybe a statement that you made about your own children, one that you wouldn’t want them to hear, is repeated in the court documents. In the old days, nothing much would come of this. Today, imagine it now: this is what people learn about you. This is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner. [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening is shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure we learn not about any specific case, only about the overall criteria; that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April, we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan:When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wante d to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a de bate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forg otten’ Online.” As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the wi nner. And only one side ones. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the directo r of this — of this organization, this agency : The Fundamental Rights and Union Citizenship in the Directorate G eneral for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School . John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its " Right to Be Forgotten ” law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So , my question is have more of the critics come over to your side now? Eric Posner:No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: E specially after tonight . Eric Posner: E specially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were , would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I ’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making cle ar you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain.[applause] John Donvan: The eminent Jonath an Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Ber kman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor so lution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken do wn sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ," in this debate , and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this moti on for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So , let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screen s. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes wit hin about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these g reat American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words , the natural words, they don’t tell you the whole story , because actually it is about a deletion right . I t is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why this is important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self -determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about y ou. You must be able to ask “ What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done i n terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are i mportant in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy , how can you organize disse nt? How can you organize a new political party, which maybe wants to [unintelligible] , if already the government always knows everything about you. So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case,“T his doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them . When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person , and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, t he person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auc tioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So , the court said you cannot ask the newspaper to take down t he information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’ t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper of the BBC, of the television station, all this stays around, but Google is subject to the same law of self -control of information, of self-determi nation of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these reque sts in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood t he European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger censors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new censors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded and there areaddress dealers and information dealers which have files o n you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold . For example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children, then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the po ssibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit and Reporting Act about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the federal trade commissioner, Julie Brill just yesterday said, “ Wes, this is what we need in America, a right to obscurity. ” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Dig and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so -called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies be hind the "right to be forgotten" is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten".

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forg et what they would otherwise remember. That ability, that right to remember is one of the most fundamental rights I think of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel , not defamation, not hate speech.

  • 00:15:59

    True information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought which is ultimately what memory is. So, this is the right to force people to forg et true information. Paul said that this is a — the way he framed it was that this is a right to control your ow n life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say . And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European courts’ decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer irrelevant or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well -conne cted elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed , because Google g ives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples : a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And the se are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing, vis -a -vis dictatorships. In a society with steady rule of law, I suppose it’s conceiva ble that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flush it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    S o , as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat , Wuut, W-U -U -T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden andembrace and allow for the self -reinvention that in so many ways define s what it is to be American ?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex -felons for the rest of their lives, not to penalize them by ma king it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future ." G iving any number of individuals a vague standard by which to contro l their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: A nd a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lecturn, Eric Posner. He is the Kirkland and Ellis distinguished service professor of law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17 -year -old boy, he’s arrested for selling drugs. A news item appears in the local pape r. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    S he sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a li ttle bit about what’s going on but they still have a relatively complete image of who she is. A third example, a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, so me of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law p rotects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called, "expungement statutes," that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And a lthough free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about the mselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay . That’s completely changed. Technology has changed. So the law at the time — th e privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance . Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Goo gle, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their paren ts ’ names into Google and out comes these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So , they’ve lost control over this information about th emselves and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with thisbalance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online. " A nd here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law S chool and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is are there problems with privacy ? Y es , there are. And if that were the motion I would hope you would join us in voting for it. If the question were have problems of privacy gotten more difficult over time ? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, “no longer relevant” in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the “right to be forgotten,” is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is “censorship,” that the kind of right that we’re talking about, where an individual can go to Google and say, “When my name comes up, I don’t want this link to show up or this link to show up or this link to show up.” But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU “right to be forgotten” as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say “deletion of information published by others,” that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term “tort action”? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word “censorship.” I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a “right to be forgotten” implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word “censorship” is not helpful here. What the “Right to Be Forgotten” does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, “Look, there’s just nothing much to see here,” like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the “Right to Be Forgotten” doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, “I’m a Democrat,” “I’m a Republican.” Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, “You can have this or you can have every single thing you do in the world be Google-able forever,” I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are “no longer relevant,” “inadequate information,” and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, the cases we hear about are happenstance — how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google “right to be forgotten horror stories.” That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European “right to be forgotten.” Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the “right to be forgotten” infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be tailored very specifically to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches [spelled phonetically] instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific just because the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, we imagine if this happened in the US, the Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power would be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, which says that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete, is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to weigh the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money the better its searches are; the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in the times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified only when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the folks were saying over the course of the evening keeps shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure there is transparency not on any specific case, only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn’t have solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this was a little bit of a tough debate in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business Law and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve most recently written the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, credit information which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty: being able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by removing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said, the way he framed it, that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Or maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that's a very bad solution to a very real problem. So, let's take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That's not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don't think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It's like time to make the donuts, you're like yes, no, no, and Paul's like, you know, they've really learned.

  • 00:30:06

    They've gotten good at this, and that's because if they don't grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there'd be something similar here. That's bad. That's more process. If Google grants the request, that's it. Intelligence Squared wouldn't be informed that their page is no longer findable through the world's most powerful information resource? There'd be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here's what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we're actually talking about, which is why, oddly enough, as much as Google hates this gift, they're like looking at the box and it's ticking. I'm hoping they don't open it. I'm hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn't know what comes up in Google search results. It's a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn't they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that's what's so strange. If there's going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it's adjudicating doesn't spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won't just be AI on Google's side taking it out, it'll be AI on the complainant's side. I'll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it'll automatically file a request on my behalf, and I don't even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it's today's ecosystem. And basically, it's just Google.

  • 00:33:00

    I'm like, "Bing, how many are you getting?" They're like, "We can't tell you." It's like, "You can't tell us because there's so many, or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it's a nonprofit? In fact, what if it's a Wiki page of Wikipedia that's taken down, and the page gets changed so the objectionable material is gone, it's been edited out, it's been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We've got to work on this stuff. Maybe court records shouldn't be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Keep in mind, please, how you voted at the top of the evening. Again, we're going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person's otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what's being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that's arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we're talking about, where an individual can go to Google and say, "When my name comes up, I don't want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn't want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don't want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in, because I asked you a question, and I don't feel you're answering, which was the response that they are calling this censorship. Paul Nemitz: It's not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It's an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That's censorship. John Donvan: When you say — just to be clear, you're not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That's correct. John Donvan: I've got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren't lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we're talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that's a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let's say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that's not censorship. That's a court action — Eric Posner: the original — John Donvan: Let's let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there's also a strong interest in privacy. John Donvan: We'll take Jonathan Zittrain. Jonathan Zittrain: I wouldn't get too hung up on the word "censorship." I think that it's — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What's happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it's very different from the examples you're coming up with. The fact, too, that it's only coming out of Google's index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it's not so bad that it needs to come down. It's like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the '90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there's a point there. The point they're making is it's not censorship of the source material. It's still there.

  • 00:41:58

    You can go find it if you dig enough, that it's actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn't make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that's the way things work. That's the way we secure our homes, for example. We don't make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won't deter people who are extremely interested in getting into our house. That's — so you raise the cost without making something impossible. That's a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what's — you know, what's interesting about this to me is I regularly hear EU officials, you know, say, "Look, there's just nothing much to see here," like this is — only if you search for the person's name directly will that search result not appear.

  • 00:43:04

    It's just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There's no way that it's just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they're pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that's not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn't apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn't even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you're not thinking of today. You're not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don't want to go out in public and say, "I'm a Democrat," "I'm a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That's the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that's happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn't even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you've phrased the question — [applause] John Donvan: — twice, and it's great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there's a pipeline of that onto the open web that Google then indexes, so that's factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I'm saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn't stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he's asking a serious question here. Jonathan Zittrain: Just the fact that he's wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that's true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He's talking about which is the more chilling effect: knowing that you're out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it's more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that's the choice I have to make, if it's really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I'll probably take this without even looking inside the box.

  • 00:47:07

    But that's a false dichotomy. I think there is plenty, like the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that's what you're wanting to do, Facebook, you'd better talk with us, because that's personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I'm, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven't heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that's what I object to. John Donvan: Let's take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I'm not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public's interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they're stated are vague, but when you actually read the cases it becomes pretty clear what's going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don't win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there's frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It's just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That's what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let's — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it's true that if you only read this one judgment, and that's normal in the law, you don't understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I've heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it's the ordinary guy who's getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it's only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it's part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we're getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it's not like the U.S. is any better, by the way. It's just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul's assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I'm just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It's now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy's name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It's not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we've got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don't agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you're talking about, they don't have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They're the most vulnerable people. The — John Donvan: But, actually, to address Andrew's point, he's saying put a form on Google where it pops up with your search results, are now on — here's — here's my response to what's being [unintelligible]. Eric Posner: Yeah, but Google hasn't done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don't think that's the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it's — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I'd like to start going to those. Ma'am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we'll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone's right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver's name. Female Speaker: Right. But still, I mean, that's what I'm asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it's okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you're saying that you would want the driver's name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other's [unintelligible]? John Donvan: Let's take that to Paul, because you've actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won't be linked under the other guy's name, push back and want that link to show up under the other guy's name? Paul Nemitz: I haven't heard about it. John Donvan: Okay. That's an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I'm a past debater. Great debate. Thank you very much. I'm surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I'm going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don't think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it's relevant in the sense that if somehow that were awkward, and I wouldn't — even if I were feeling, geez, I wish that were now not the third hit on Google, I've got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It's already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased, so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented. John Donvan: [unintelligible] take right down in front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search-and-seizure context, that a reasonable expectation of privacy — you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on, you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency, like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, against the decision of the FTC, to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, that’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know, because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation, which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks have been saying over the course of the evening is shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it’s reviewed not on a specific case but only on the overall criteria; that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April, we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on The U.S. Should Adopt the "Right to be Forgotten" Online: in the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. From me, John Donvan, and Intelligence Squared U.S., thank you. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "right to be forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense, I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or, for that matter, if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, the school, after a certain time, for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper, in order to make the auctioning of the house to cover the social contributions more attractive for the Spanish state, because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty: the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so, finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist; she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they were private. Not everybody in the world, or even outside this area, hears about these events. And the law protected them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan — smart guy — or me — not so smart guy — we would have been laughed out of the room. "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement why that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that, whether fascist dictators or communist dictators, they don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state with some trepidation to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, gees, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision at the insistence of the government, and it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and — in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just a very "nothing to see here," kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific just because the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper maybe the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —

    [laughter]

    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion — they’re arguing for the "right to be forgotten."

    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that a reasonable expectation of privacy — you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me, in Europe, applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying another way of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law, so that the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination. And it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech — the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. That perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that it’s not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and the ticket sales don't come close to covering the cost of mounting a great debate like this. So, I'd encourage you, if you would be willing to make a donation, to go to our website, that's IQ2US.org, and make a donation there. Our next debate is later this month at Columbia University at the Miller Theatre. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center, to continue our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we're going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates. And I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We've heard the arguments — sorry? Did I — sorry? There's chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote on the motion The U.S. Should Adopt the "Right to be Forgotten" Online: 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion came in at 35 percent, which means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. From me, John Donvan, and Intelligence Squared U.S., thank you. We'll see you next time. [applause]


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the directo r of this — of this organization, this agency : The Fundamental Rights and Union Citizenship in the Directorate G eneral for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School . John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its " Right to Be Forgotten ” law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So , my question is have more of the critics come over to your side now? Eric Posner:No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: E specially after tonight . Eric Posner: E specially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were , would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I ’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making cle ar you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain.[applause] John Donvan: The eminent Jonath an Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Ber kman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor so lution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken do wn sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ," in this debate , and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this moti on for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So , let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screen s. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes wit hin about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these g reat American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words , the natural words, they don’t tell you the whole story , because actually it is about a deletion right . I t is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why this is important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self -determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about y ou. You must be able to ask “ What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done i n terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are i mportant in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy , how can you organize disse nt? How can you organize a new political party, which maybe wants to [unintelligible] , if already the government always knows everything about you. So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case,“T his doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them . When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person , and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, t he person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auc tioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So , the court said you cannot ask the newspaper to take down t he information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’ t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper of the BBC, of the television station, all this stays around, but Google is subject to the same law of self -control of information, of self-determi nation of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these reque sts in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood t he European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger censors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new censors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded and there areaddress dealers and information dealers which have files o n you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold . For example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children, then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the po ssibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit and Reporting Act about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the federal trade commissioner, Julie Brill just yesterday said, “ Wes, this is what we need in America, a right to obscurity. ” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Dig and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so -called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies be hind the "right to be forgotten" is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten".

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forg et what they would otherwise remember. That ability, that right to remember is one of the most fundamental rights I think of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel , not defamation, not hate speech.

  • 00:15:59

    True information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought which is ultimately what memory is. So, this is the right to force people to forg et true information. Paul said that this is a — the way he framed it was that this is a right to control your ow n life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say . And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European courts’ decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer irrelevant or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well -conne cted elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed , because Google g ives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples : a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And the se are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing, vis -a -vis dictatorships. In a society with steady rule of law, I suppose it’s conceiva ble that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flush it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    S o , as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat , Wuut, W-U -U -T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden andembrace and allow for the self -reinvention that in so many ways define s what it is to be American ?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex -felons for the rest of their lives, not to penalize them by ma king it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future ." G iving any number of individuals a vague standard by which to contro l their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: A nd a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lecturn, Eric Posner. He is the Kirkland and Ellis distinguished service professor of law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17 -year -old boy, he’s arrested for selling drugs. A news item appears in the local pape r. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    S he sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a li ttle bit about what’s going on but they still have a relatively complete image of who she is. A third example, a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, so me of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law p rotects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called, "expungement statutes," that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And a lthough free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about the mselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay . That’s completely changed. Technology has changed. So the law at the time — th e privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance . Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Goo gle, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their paren ts ’ names into Google and out comes these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So , they’ve lost control over this information about th emselves and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with thisbalance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online. " A nd here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law S chool and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is are there problems with privacy ? Y es , there are. And if that were the motion I would hope you would join us in voting for it. If the question were have problems of privacy gotten more difficult over time ? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger, if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty, in the good work that Paul is doing in the EU, to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported, and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make a decision on a daily basis, do we publish something or not. And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over. I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it just happens that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I’m saying it’s another avenue of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir, in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, it’s blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money from searching — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But such censorship is justified only when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So when you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it is not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t owe solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on The U.S. Should Adopt the "Right to be Forgotten" Online: 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result was, for the team arguing for the motion, 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he has done or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said, sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the “for” side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that whether fascist dictators or communist dictators, they don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration, there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. [laughter] Eric Posner: You know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete, is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google, that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are toward deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money from searching: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening keeps shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and in the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percent. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve most recently written the book "The Twilight of Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and, you know, they’ve really learned.

  • 00:30:06

They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

[laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

Let it do so in camera — that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

This issue of privacy, that is. But they say the solution, the "right to be forgotten," is a terrible one, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as it did in the communist and the Nazi dictatorships.

  • 00:38:02

It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of, literally, information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, and tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

Eric Posner: Not the subsequent one. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving is that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks — to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

You can go find it if you dig enough — that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person — become a politician — journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest in knowing what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post — namely that a concert pianist was asking for a takedown of the critique, the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is, what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power, or the individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make — if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff — before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook — you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law — that’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make a decision on a daily basis: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made him or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about — they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, saves the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay — for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Do they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story — which won’t be linked under the other guy’s name — push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten." Female Speaker: Right, so to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, which says, you know, that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU, it sounds like, give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying it’s another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money by searching well; the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it’s not on a specific case, only on the overall criteria; that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about. You just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and in the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent — that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percent. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side — that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say — and children when they become 18 — to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the Federal Trade Commissioner, Julie Brill, just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control — they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, “no longer relevant in the view of the complainant” information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the “right to be forgotten” online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the “right to be forgotten” online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the “right to be forgotten,” is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is “censorship”: that the kind of right that we’re talking about, where an individual can go to Google and say, “When my name comes up, I don’t want this link to show up or this link to show up or this link to show up,” amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU “right to be forgotten” as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say “deletion of information published by others,” that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term “tort action”? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word “censorship.” I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a “right to be forgotten” implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word “censorship” is not helpful here. What the “Right to Be Forgotten” does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, “Look, there’s just nothing much to see here,” like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the “Right to Be Forgotten” doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, “I’m a Democrat,” “I’m a Republican.” Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, “You can have this or you can have every single thing you do in the world be Google-able forever,” I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are “no longer relevant,” “inadequate information,” and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google “right to be forgotten horror stories.” That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European “right to be forgotten.” Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the “right to be forgotten” infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, saves the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I'm Adam. This law seems to be very specifically tailored to Google. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody of whom I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea. Paul Nemitz: It's not Google-specific; it's just that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms to temper things that we see as problematic about the "right to be forgotten" — in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable? Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased, so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side. [laughter] Eric Posner: You know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you're talking about, the side arguing for the motion — they're arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with the U.S. position, going to your constitutional point, that in the Fourth Amendment search-and-seizure context you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on. Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity: you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it's exactly the kind of surveillance on who's wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us. John Donvan: Okay. So you're asking, if you were smart enough, which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation. I know the law in Europe doesn't address it, but let's say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law; but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me applies, in Europe, to the state and also to corporations. But of course, within reasonable national security limitations and so on, you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you'd like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal against imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I'm saying it's another form of press suppression — John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you're 50 now, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life, I think, is heartbreaking. And it's something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody's life. And so, you know, my view on this is still that censorship is not the right answer. And I don't mean "censorship" as a kind of nuclear bomb that ends this argument, but as positioning this as what it is: a right to force other people to delete things that they would otherwise rather not delete. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we're going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency, like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google where, you know, there's blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: all of the pressures that you just outlined are toward deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered. John Donvan: But it does — it does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that's the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. Google makes money by searching well; the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don't want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine, in the world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation, which is what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths (more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes) to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine, again, but this time that it's you: that perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children; or maybe a statement that you made about your own children, one you wouldn't want them to hear, is repeated in the court documents. In the old days, nothing much would come of this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can't believe that, in the last slot, I'm still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening. [laughter] This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the pro folks were saying over the course of the evening keeps shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff stays up and only the bad stuff comes down." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that happens not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn't offer solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it's a search on a name, that maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won't go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I'm sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so. [applause] John Donvan: Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to be Forgotten’ Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results, the difference between the two votes, within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and there was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which go on to the courts. Google has understood the European law of balance between privacy and free speech. The decisions were, in the beginning, very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty, an ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information: not libel, not defamation, not hate speech.

  • 00:15:59

Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is — a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles, the ones that I found being in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her, and what she’s like, and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay, that’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check every month to a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The US Should Adopt the 'Right to Be Forgotten' Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The US Should Adopt the 'Right to Be Forgotten' Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must never say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is that through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
John Donvan: Well —
Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
Jonathan Zittrain: Thank you.
John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
Male Speaker: Hi.
John Donvan: If you can stand up and grab the mic.
Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
John Donvan: Paul Nemitz, in Europe.
Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that?
Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
Paul Nemitz: All search engines.
Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
Paul Nemitz: It’s not Google-specific; it’s that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
John Donvan: Sir. Can you tell us your name, please?
Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
John Donvan: Good question.
Male Speaker: And then if, you know, Google wanted to deny a request, or didn’t want to deal with the process, you could have some special advocate appointed to —
John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
John Donvan: Eric, did you want to respond to that?
Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about — how strict the criteria would be — and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.
[laughter]
Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think that there are administrative criteria. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
John Donvan: [unintelligible] take right down in the front, in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion; they’re arguing for the "right to be forgotten."
Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that as for a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner.
Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
John Donvan: I’m going to — unless you really need to respond to that question — I’m going to move on.
Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
Jonathan Zittrain: I —
John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
[laughter]
John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
[talking simultaneously]
John Donvan: Yeah, thanks.
Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
John Donvan: Okay. So you’re asking: if you were smart enough, which sounds like you probably are —
[laughter]
John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations — but of course, within reason: national security limitations and so on. You know, there we are not so far apart.
John Donvan: So, how far are you getting in asking the NSA to delete information?
[laughter]
Paul Nemitz: Well, the answer is: he described a case where he would, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks.
Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —
Male Speaker: Not exactly. I’m saying another way of press suppression —
John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
[applause]
Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin.
Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so my view on this is, still: censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
Male Speaker: But Paul —
Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
Male Speaker: Yeah, Paul, I have to — [unintelligible].
John Donvan: [unintelligible].
Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
John Donvan: But it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.
Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that.
Eric Posner: There are interests on both sides.
John Donvan: Eric Posner.
Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —
Andrew McLaughlin: How are they vindicated?
Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —
[talking simultaneously]
Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
Eric Posner: No, but they don’t want to set a precedent where they —
Jonathan Zittrain: Google will pay no penalty for over-deleting.
Eric Posner: They will.
Jonathan Zittrain: And no one will even know.
Eric Posner: They will.
Jonathan Zittrain: Who will even know?
Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause]
John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers.
Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.
John Donvan: Thank you, Paul Nemitz.
[applause]
John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation — which is what speech on the internet is; it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
John Donvan: Thank you, Andrew McLaughlin.
[applause]
John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
John Donvan: Thank you, Eric Posner.
[applause]
John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
[laughter]
This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening keeps shifting. It’s a standard of — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that happens not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, including possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible links — whatever comes out of the roulette wheel — but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition —
John Donvan: Jonathan Zittrain, I’m sorry —
Jonathan Zittrain: — can I end with one thing —
John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.
[laughter]
[applause]
And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]
John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
[talking simultaneously]
Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
[applause]
John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and in the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result was: for the team arguing for the motion, their second vote was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percent. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he has done or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the Federal Trade Commissioner, Julie Brill, just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy, he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to use my opening statement to explain that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests. It’s like time to make the donuts. You’re like yes, no, no. And Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is, the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say, put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is raise the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or if we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this: please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. "Relevance" is a term that comes up over and again. So, this happens, and what happens is that through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make, on a daily basis, a decision: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made the decision himself or herself to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, from the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish premier league referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay. I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked on implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curveball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody of whom I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute that said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down on your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search-and-seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as delete — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law; but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal — but as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money by searching well: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his — and here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote: the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation. That is what speech on the internet is — it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you: that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children; or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come of this. Today, imagine it now: this is what people learn about you. This is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t owe solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what was it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richmond Center for Business Law and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and the first vote: 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote the result was, for the team arguing for the motion, their second vote was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning were very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and the Federal Trade Commissioner, Julie Brill, just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

The suppression of true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, "Bing, how many of these are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether under fascist dictators or communist dictators: they didn’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "right to be forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "right to be forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make — if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session; decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, which at the insistence of the government it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make a decision on a daily basis: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations, and the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —

    [laughter]

    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten."

    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, where, as to a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. This constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying another way of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, it’s blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good, and we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law: to weigh the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money from the quality of its searches — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you. This is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the pro folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that disclosure happens not on specific cases but only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you if you would be willing to make a donation to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and the first vote: 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat, Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy, he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on but they still have a relatively complete image of who she is. A third example, a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is are there problems with privacy? Yes, there are. And if that were the motion I would hope you would join us in voting for it. If the question were have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts. You’re like, yes, no, no. And Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is raise the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here" — like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique — of his piano performance here in America: yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes. So that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or an individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable: that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the U.S. is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about — they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story — that won’t be linked under the other guy’s name — push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific — it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states that allow people’s criminal records to be erased — mean that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.
    [laughter]
    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think there are administrative criteria we can set. Now, it’s often very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with the position, at least in the U.S. — going to your constitutional point — that in the Fourth Amendment search-and-seizure context, you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe apply to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about it. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still: censorship — which I don’t mean as a kind of nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the authority.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that referee’s name to be associated with it.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: the profit motive. Google makes money by searching well — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. The construction Paul just put forward is, in my judgment, a false one. This is not the empowerment of an individual on one side of a conversation — which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified because it overcomes a very high bar in order to stand as an exception to the right of free speech. The affirmative case has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. You can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, and to some extent statutory law and other types of law. And I just want to ask you to imagine, again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people expect. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So when you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    This has been wonderful. Sorry. We’ll strike that later. We’ve heard a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the "for" folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure transparency happens not on specific cases, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page, marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate — it was relatively nuanced — and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry — the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already?
    Male Speaker:
    It was erased.
    [laughter]
    Male Speaker:
    Does anybody remember?
    [applause]
    John Donvan:
    All right. Well, for the sake of people who are listening —
    [laughter]
    Play along, please.
    [laughter]
    Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" — in the first vote, 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points.
    [applause]
    John Donvan:
    The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time.
    [applause]

  • 01:31:40

  • 00:00:00

    John Donvan:
    When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz.
    [applause]
    John Donvan:
    Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC.
    [laughter]
    John Donvan:
    And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what?
    Paul Nemitz:
    I meant that there are limits to snooping, collecting and making my private life public on Google.
    John Donvan:
    And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument.
    Paul Nemitz:
    The eminent professor Eric Posner from the University of Chicago Law School.
    John Donvan:
    Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause]
    Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "right to be forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now?
    Eric Posner:
    No, they haven’t, but I’m hoping they’ll change their minds soon enough.
    John Donvan:
    Especially after tonight.
    Eric Posner:
    Especially after tonight.
    John Donvan:
    Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."
    [applause]
    John Donvan:
    And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin.
    [applause]
    John Donvan:
    Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language?
    Andrew McLaughlin:
    [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty.
    John Donvan:
    Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew?
    Andrew McLaughlin:
    The equally eminent Professor Jonathan Zittrain.
    John Donvan:
    Ladies and gentlemen, Jonathan Zittrain.
    [applause]
    John Donvan:
    The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain:
    Well, we could end early and just hit the bar, but —
    [laughter]
    Jonathan Zittrain:
    — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes.
    John Donvan:
    All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz.
    [applause]
    Paul Nemitz:
    Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice, or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decisions — you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say — and children when they become 18 — to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much.
    John Donvan:
    Thank you, Paul Nemitz.
    [applause]
    John Donvan:
    And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin.
    [applause]
    Andrew McLaughlin:
    Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right, "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut — W-U-U-T — which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.
    John Donvan:
    Thank you, Andrew McLaughlin.
    [applause]
    John Donvan:
    And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause]
    Eric Posner:
    I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important — but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends — you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause]
    John Donvan:
    Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy —
    [applause]
    John Donvan:
    — School of Government. Ladies and gentlemen, Jonathan Zittrain.
    Jonathan Zittrain:
    Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement why that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, “no longer relevant,” in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the “right to be forgotten,” is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is “censorship,” that the kind of right that we’re talking about, where an individual can go to Google and say, “When my name comes up, I don’t want this link to show up or this link to show up or this link to show up,” amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU “right to be forgotten” exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say “deletion of information published by others,” that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term “tort action”? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word “censorship.” I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a “right to be forgotten” implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word “censorship” is not helpful here. What the “Right to Be Forgotten” does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, “Look, there’s just nothing much to see here,” like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the “Right to Be Forgotten” doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, “I’m a Democrat,” “I’m a Republican.” Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, “You can have this, or you can have every single thing you do in the world be Google-able forever,” I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are “no longer relevant,” “inadequate” information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) Whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google “right to be forgotten horror stories.” That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European “right to be forgotten.” Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the “right to be forgotten” infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific just because the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search-and-seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations — but of course, within reasonable national security limitations and so on; you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying it’s another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no — we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money by searching well; the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right of informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But such censorship is justified only when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you: that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web — let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening is shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that happens not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and the ticket sales don't come close to covering the cost of mounting a great debate like this. So, I'd encourage you, if you would be willing to make a donation, to go to our website, that's IQ2US.org, and make a donation there. Our next debate is later this month. It's at Columbia University at the Miller Theater. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center, and we're going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we're going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We've heard the arguments — sorry? Did I — sorry? There's chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We'll see you next time. [applause]

  • 00:01:04

Let's meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that's what you're going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent Professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You've written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven't, but I'm hoping they'll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let's welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You're no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I'd probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There's really no place in the world that's been so affected by it. But I do think it's a travesty. John Donvan: Okay, we're making clear you're not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you're a professor of law and computer science at Harvard. You're a cofounder of the Berkman Center for Internet & Society. You looked at the EU's — Europe's "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you're saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I've described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We're very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you'll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let's register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

Push number one if you're for the motion, number two if you're against, and number three if you're undecided. And we'll take about 15 seconds to complete that process, and then we'll lock out the voting devices. All right, folks, let's lock out those votes. And, again, we'll have the second vote right after you've heard the closing arguments. And then we'll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I'm not here on an official mission from the European Commission to turn America to this notion of law. It's more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten": the words, the natural words, they don't tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the '70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can't ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that's what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don't have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn't fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn't apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask a school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it's in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn't mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what's happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, you can ask for credit information older than seven years to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake you made that ended up on the internet or in the press. The problem, though, is the following, and this is what I'm going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn't even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that's the outline of the argument that I'm going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it's a duty, a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. And let's be very clear about what we're talking about here. We're talking about the suppression of true information: not libel, not defamation, not hate speech.

  • 00:15:59

Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we're talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul framed this as a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it's way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

The language in the European court's decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It's very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul's claim that it's working kind of well, you can Google — well, you can Google while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you'll find the case of the piano player who didn't like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it's conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that's not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we've seen that the internet adapts. We've seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I'll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what's going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I'm John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You've heard two of the opening arguments, and now on to the third. Let's welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I'd like to ask you to cast your minds back 25 years, 30 years. Those of you who are old enough will remember; for the younger people here, I'll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He's arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That's not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not know about it. They know a lot about this kid. He moves on with his life. It's not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she's able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she's like and how she is a good mother. So, they know a little bit about what's going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other's infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they were private. Not everybody in the world, or even outside this area, heard about these events. And the law protected them. There were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let's say an employer or a creditor or a future romantic partner — it's really in the stranger's interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn't convicted, it's still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people's privacy." But, of course, this is what's happened. It's happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the '80s and '90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay, that's completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people's privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let's suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name into Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but then again he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother's friends, you know, in the '90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would have written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That's how she's going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It's public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents' names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they've lost control over this information about themselves, and it's not just that they've lost this control; they've lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is "Are there problems with privacy?" — yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were "Have problems of privacy gotten more difficult over time?" we'd vote for that motion, too. I think I'm comfortable saying Andrew would vote for it. If the motion were "Should everything be recorded at all times, and if so, should we do something about that?" I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement why that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like, yes, no, no. And Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that, whether fascist dictators or communist dictators, they didn’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, when we say "deletion of information published by others," that is deletion, literally, of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here." Like, this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who has stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes. So, that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, which at the insistence of the government it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made the decision himself or herself to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results: here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, which won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy, and profiles you much more, than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the U.S.: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to deny a request, or didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the U.S. to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes, in most states, that allow people’s criminal records to be erased — the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I are very close together, meaning that he’s on our side.
    [laughter]
    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think there are administrative criteria we can set. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion. They’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search-and-seizure context, which says you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found, and would find, that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion, and the right to ask what you have about me, applies in Europe to the state and also to corporations — but of course, within reason, national security limitations and so on; you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is: he described a case where he would, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us, pointing to existing EU press law, that there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me — particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should the first thing they find be this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly — and then a judge will decide — or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google — that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law: to weigh the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the U.S. should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right of informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the U.S. should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion, "The U.S. Should Adopt a ‘Right to Be Forgotten’ Online." And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden of showing that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children, that you wouldn’t want them to hear, is repeated in the court documents. In the old days, nothing much would come of this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s, "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff comes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure transparency is not on specific cases, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — including, possibly, even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page, marked as such, since many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate — it was relatively nuanced — and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry — the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again the motion was "The U.S. Should Adopt the ‘Right to be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to be Forgotten’ Online." In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent on the first vote to 56 percent on the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning were very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, or he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when a request is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, "here’s what was deleted today," it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, when we say "deletion of information published by others," that is, literally, deletion of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique, the bad critique of his piano performance here in America: yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power, or for the individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made the decision himself or herself to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down, all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results: here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in a story, one that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific; the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —
    [laughter]
    Eric Posner: — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set these up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion; they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that as for a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question — I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me?" in Europe apply to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are toward deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee: the fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round: Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion, "The US should adopt a 'right to be forgotten'" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote: the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified because it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine, again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web — let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t have solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what was it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry — the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to be Forgotten’ Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion had a first vote of 26 percent and a second vote of 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]
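The scoring rule the moderator describes above can be sketched in a few lines. This is only an illustration: the function name is mine, the percentages are the ones announced in this debate, and the second-round undecided share (9 percent) is inferred from the other two announced figures rather than stated in the transcript.

```python
# Intelligence Squared scoring rule, as described by the moderator:
# each side is scored by the change in its share of the audience vote
# between the opening and closing polls; the larger gain wins.

def vote_swing(before: int, after: int) -> int:
    """Change in a side's vote share, in percentage points."""
    return after - before

# Announced results for this debate (undecided share in the second
# vote is inferred as 100 - 35 - 56 = 9).
first = {"for": 36, "against": 26, "undecided": 38}
second = {"for": 35, "against": 56, "undecided": 9}

swings = {side: vote_swing(first[side], second[side])
          for side in ("for", "against")}
winner = max(swings, key=swings.get)

# swings  -> {'for': -1, 'against': 30}
# winner  -> 'against'
```

Note that the "against" side's gain works out to 30 percentage points (56 minus 26), which is why that team is declared the winner despite the "for" side having led on the first vote.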

  • 01:31:40

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written the book, most recently, "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice, or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. In the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut (W-U-U-T), which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal. What a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say: please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest in knowing what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique, the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it (and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too), the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is that through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down, all of this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration. There’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This — this law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe?
    Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms to temper, maybe, the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations, and the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to deny a request, or didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious that you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And — I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws, statutes in most states, allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something we can debate about, how strict the criteria would be, and I suspect Jonathan and I are very close together, meaning that he’s on our side.
    [laughter]
    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten": only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think there are administrative criteria. Now, it’s often very difficult to set these up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works, and I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front, in the third row, please. The mic’s coming down on your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that, at least in the U.S., going to your constitutional point, with the Fourth Amendment search-and-seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question — I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity; you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and, if it would apply, how it would apply in the United States to other private search engines, such as the NSA’s database of information about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric?
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, on your question about the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me?" applies in Europe to the state and also to corporations — though of course within reason, with national security limitations and so on. There, you know, we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is: he described a case where he would, as a developer, for his private purposes, design his own web engine. That constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back, near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us, pointing to existing EU press law, that there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back, in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin?
    Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about it. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so my view on this is, still, that censorship — which I don’t mean to be a kind of nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And, by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we’re going to delete or not: what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, it’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. Nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee: the fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, so that the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. What’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money by searching well; the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the European Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation. That is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But in each case the censorship is justified by overcoming a very high bar in order to stand as an exception to the right of free speech. The affirmative case has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law and, to some extent, statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you: that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children; or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So when you go out and talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you: who is the "he" in that statement? Is the "he" you? Is the "he" me? Or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening.
    [laughter]
    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to keep that reality from coming about. But notice how what the folks on the other side were saying over the course of the evening kept shifting. It’s a standard of, "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like: When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know? Paul has been working, I think, to make sure transparency comes not on any specific case, only on the overall criteria; that seems very, very dangerous to me. Now, John said we didn’t offer solutions, and in 30 seconds it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page, marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, if you would be willing to make a donation, I’d encourage you to go to our website, IQ2US.org, and make a donation there. Our next debate is later this month, at Columbia University’s Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center, to continue the ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here, on the 15th, in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates. And I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already?

    Male Speaker: It was erased. [laughter]

    Male Speaker: Does anybody remember? [applause]

    John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second; they went up 30 percentage points. [applause]

    John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan:When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wante d to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a de bate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forg otten’ Online.” As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the wi nner. And only one side ones. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the directo r of this — of this organization, this agency : The Fundamental Rights and Union Citizenship in the Directorate G eneral for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School . John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its " Right to Be Forgotten ” law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So , my question is have more of the critics come over to your side now? Eric Posner:No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: E specially after tonight . Eric Posner: E specially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were , would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I ’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making cle ar you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain.[applause] John Donvan: The eminent Jonath an Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Ber kman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor so lution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken do wn sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ," in this debate , and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this moti on for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So , let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screen s. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes wit hin about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these g reat American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words , the natural words, they don’t tell you the whole story , because actually it is about a deletion right . I t is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why this is important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self -determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about y ou. You must be able to ask “ What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done i n terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are i mportant in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy , how can you organize disse nt? How can you organize a new political party, which maybe wants to [unintelligible] , if already the government always knows everything about you. So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case,“T his doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them . When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person , and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, t he person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auc tioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So , the court said you cannot ask the newspaper to take down t he information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’ t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper of the BBC, of the television station, all this stays around, but Google is subject to the same law of self -control of information, of self-determi nation of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these reque sts in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood t he European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger censors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new censors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded and there areaddress dealers and information dealers which have files o n you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold . For example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children, then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the po ssibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit and Reporting Act about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the federal trade commissioner, Julie Brill just yesterday said, “ Wes, this is what we need in America, a right to obscurity. ” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Dig and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so -called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies be hind the "right to be forgotten" is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten".

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forg et what they would otherwise remember. That ability, that right to remember is one of the most fundamental rights I think of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel , not defamation, not hate speech.

  • 00:15:59

    True information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought which is ultimately what memory is. So, this is the right to force people to forg et true information. Paul said that this is a — the way he framed it was that this is a right to control your ow n life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say . And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European courts’ decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer irrelevant or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well -conne cted elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed , because Google g ives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples : a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And the se are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing, vis -a -vis dictatorships. In a society with steady rule of law, I suppose it’s conceiva ble that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flush it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    S o , as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat , Wuut, W-U -U -T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden andembrace and allow for the self -reinvention that in so many ways define s what it is to be American ?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex -felons for the rest of their lives, not to penalize them by ma king it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future ." G iving any number of individuals a vague standard by which to contro l their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: A nd a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lecturn, Eric Posner. He is the Kirkland and Ellis distinguished service professor of law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17 -year -old boy, he’s arrested for selling drugs. A news item appears in the local pape r. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    S he sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a li ttle bit about what’s going on but they still have a relatively complete image of who she is. A third example, a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, so me of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law p rotects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called, "expungement statutes," that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And a lthough free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about the mselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay . That’s completely changed. Technology has changed. So the law at the time — th e privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance . Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Goo gle, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their paren ts ’ names into Google and out comes these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So , they’ve lost control over this information about th emselves and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with thisbalance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online. " A nd here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law S chool and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is are there problems with privacy ? Y es , there are. And if that were the motion I would hope you would join us in voting for it. If the question were have problems of privacy gotten more difficult over time ? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests. It’s like "time to make the donuts," you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, "here’s what was deleted today," it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that, whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was their response, that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent one. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving is that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is raise the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, for example, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking, a little bit, but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening, tomorrow you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is that through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, which at the insistence of the government it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: Well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Do they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

John Donvan: Well —

Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.

John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."

Jonathan Zittrain: Thank you.

John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

Male Speaker: Hi.

John Donvan: If you can stand up and grab the mic.

Male Speaker: Hi. I'm Adam. This law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

John Donvan: Paul Nemitz, in Europe.

Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that?

Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

Paul Nemitz: All search engines.

Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it's such a bad idea.

Paul Nemitz: It's not Google-specific; it's that the case was about Google.

  • 01:00:57

But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

John Donvan: Sir. Can you tell us your name, please?

Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there mechanisms to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations, and the regulations would be a bit more specific.

John Donvan: Good question.

Male Speaker: And then if, you know, Google wanted to deny a request, or didn't want to deal with the process, you could have some special advocate appointed to —

John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And — I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.

Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.

John Donvan: Eric, did you want to respond to that?

Eric Posner: It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased — the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I are very close together — meaning that he's on our side.

[laughter]

Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

So, I do think there are administrative criteria. Now, often it's very difficult to set these up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.

John Donvan: [unintelligible] take right down in the front in the third row, please. The mic's coming down your right-hand side. And if you could stand up —

Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion. They're arguing for the "right to be forgotten."

Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with the fact that in the U.S. — going to your constitutional point — in the Fourth Amendment search-and-seizure context, you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner.

Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yes, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.

John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on.

Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —

John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.

Jonathan Zittrain: I —

John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it's exactly the kind of surveillance on who's wanting to peek under the envelope that all sides appear to be concerned about.

John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

[laughter]

John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

[talking simultaneously]

John Donvan: Yeah, thanks.

Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.

John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are —

[laughter]

John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

John Donvan: So, how far are you getting in asking the NSA to delete information?

[laughter]

Paul Nemitz: Well, the answer is: he described a case where he would, as a developer, for his private purposes, design his own web engine. This constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks.

Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is an alarming signal against imposing something like this?

John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —

Male Speaker: Not exactly. I'm saying another way of press suppression —

John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

[applause]

Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.

John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled is that people develop over time. The one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They're only looking at what comes up on the first page. Andrew McLaughlin.

Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply, and I think a lot about it. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody's life. And so my view on this is, still: censorship — which I don't mean to be kind of a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

They created something called Google Plus, where anybody can create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.

John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we're going to delete or not, what are the checks on that from outside? How does that work?

Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.

Male Speaker: But Paul —

Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google, where, you know, it's the blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth estate. It is good that the press is after Google and says: be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

Male Speaker: Yeah, Paul, I have to — [unintelligible].

John Donvan: [unintelligible].

Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: all of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to disappear. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.

John Donvan: But it does — it does live on. It does live on. It's not removed from the internet. It's just you can't find it through the criminal's name.

Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee's — name to be associated with that.

  • 01:17:01

They have an interest in that.

Eric Posner: There are interests on both sides.

John Donvan: Eric Posner.

Eric Posner: So, that's the classic role for the law, so that the interests on both sides —

Andrew McLaughlin: How are they vindicated?

Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —

[talking simultaneously]

Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

Eric Posner: No, but they don't want to set a precedent where they —

Jonathan Zittrain: Google will pay no penalty for over-deleting.

Eric Posner: They will.

Jonathan Zittrain: And no one will even know.

Eric Posner: They will.

Jonathan Zittrain: Who will even know?

Eric Posner: They will — they will know, because it will do worse than an alternative search engine, in a world in which we have search engines competing with each other.

John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause]

John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.

John Donvan: Thank you, Paul Nemitz.

[applause]

John Donvan: And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

John Donvan: Thank you, Andrew McLaughlin.

[applause]

John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.

Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you. Perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children, one you wouldn't want them to hear, is repeated in the court documents. In the old days nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you: who is the "he" in that statement? Is the "he" you? Is the "he" me? Or is the "he" Google? Thank you very much.

John Donvan: Thank you, Eric Posner.

[applause]

John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

Jonathan Zittrain: I can't believe that, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

[laughter]

Jonathan Zittrain: This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like: when Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know? Paul has been working, I think, to make sure they know not about the specific case, only the overall criteria — and that seems very, very dangerous to me. Now, John said we didn't offer solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding, when it's a search on a name, that maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such, since many won't go to the second page; at reputation systems; and, of course, competition —

John Donvan: Jonathan Zittrain, I'm sorry —

Jonathan Zittrain: — can I end with one thing —

John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

[laughter]

[applause]

John Donvan: And that concludes our closing statements.

  • 01:27:04

And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]

John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

[talking simultaneously]

Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.

[applause]

John Donvan: Well, I'm sorry — the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you if you would be willing to make a donation to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and the first vote: 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion: their second vote was 35 percent, which means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent Professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google any number of articles. The ones that I found were in the English language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times and, if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is, the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anyone to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America: yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out — because why not, if that’s what you’re wanting to do, Facebook — you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches [spelled phonetically] instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific just because the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —
    [laughter]
    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, that says, you know, in the Fourth Amendment search and seizure context, that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and, if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are — [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And, by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency, like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where it’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, so that the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money: the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in the world in which we have search engines competing with each other.
    John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the notion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this notion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz. [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation. That is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin. [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you: perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now, this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner. [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening is shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it’s not on a specific case, only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page, marked as such, and then many won’t go to the second page; reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was: The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on The U.S. Should Adopt the "Right to be Forgotten" Online: in the first vote, 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percent. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is an application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision, you know, was in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and for children when they become 18 — to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty: the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by removing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, whereas maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control — they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they don’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you’re expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made, himself or herself, the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that, because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

John Donvan: Well —

Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

Jonathan Zittrain: Thank you.

John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

Male Speaker: Hi.

John Donvan: If you can stand up and grab the mic.

Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

John Donvan: Paul Nemitz, in Europe.

Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that?

Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

Paul Nemitz: All search engines.

Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

Paul Nemitz: It’s not Google-specific just because the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

John Donvan: Sir. Can you tell us your name, please?

Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

John Donvan: Good question.

Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

Jonathan Zittrain: It is curious that you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

John Donvan: Eric, did you want to respond to that?

Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.

[laughter]

Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down on your right-hand side. And if you could stand up —

Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten."

Female Speaker: Right. So, to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search-and-seizure context, which says you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner.

Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

Jonathan Zittrain: I —

John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.

John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

[laughter]

John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

[talking simultaneously]

John Donvan: Yeah, thanks.

Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

John Donvan: Okay. So you’re asking, if you were smart enough, which it sounds like you probably are —

[laughter]

John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me?" in Europe apply to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

John Donvan: So, how far are you getting in asking the NSA to delete information?

[laughter]

Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks.

Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.

John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —

Male Speaker: Not exactly. I’m saying it’s another avenue of press suppression —

John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should the first thing they find be this racist chant they did, or should they not find it at all?

[applause]

Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin.

Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about it. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody can create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.

John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC.

Male Speaker: But Paul —

Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

Male Speaker: Yeah, Paul, I have to — [unintelligible].

John Donvan: [unintelligible].

Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

John Donvan: But it does live on. It does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.

Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that.

Eric Posner: There are interests on both sides.

John Donvan: Eric Posner.

Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —

Andrew McLaughlin: How are they vindicated?

Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. Google makes money by searching well — the better its searches are, the more money it makes. This is why —

[talking simultaneously]

Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

Eric Posner: No, but they don’t want to set a precedent where they —

Jonathan Zittrain: Google will pay no penalty for over-deleting.

Eric Posner: They will.

Jonathan Zittrain: And no one will even know.

Eric Posner: They will.

Jonathan Zittrain: Who will even know?

Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause]

John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right of informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.

John Donvan: Thank you, Paul Nemitz.

[applause]

John Donvan: And that’s our motion: The US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But such censorship is justified only when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

John Donvan: Thank you, Andrew McLaughlin.

[applause]

John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

John Donvan: Thank you, Eric Posner.

[applause]

John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

[laughter]

Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out, and not the bad stuff." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —

John Donvan: Jonathan Zittrain, I’m sorry —

Jonathan Zittrain: — can I end with one thing —

John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

[laughter]

[applause]

John Donvan: And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]

John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

[talking simultaneously]

Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

[applause]

John Donvan: Well, I’m sorry — the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. From me, John Donvan, and Intelligence Squared U.S., thank you, and we’ll see you next time. [applause]

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s — "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government always already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment: first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask the school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house to cover the social contributions more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

Suppressing true information intrudes into our collective memory, by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

[laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us, along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering it, which was their charge that this is censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers, they have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results, here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, keeps the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and — in fact, I don’t know — calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody I could then demand that my data be removed from, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very "Nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific — it’s just that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together — meaning that he’s on our side. [laughter] Eric Posner: For example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think there are administrative criteria. Now, often it’s very difficult to set these up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that, at least in the U.S. — going to your constitutional point — with the Fourth Amendment search and seizure context, which says that as to a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe apply to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I’m saying it’s another avenue of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And, by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. Nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated. What’s on the side of disclosure is something far more powerful than the government: the profit motive. Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. He frames it as the empowerment of an individual on one side of a conversation — which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, that simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But such censorship is justified only when it overcomes a very high bar and stands as an exception to the right of free speech. This case — the affirmative case — has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people expect. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the folks on the other side were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff comes down." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. And we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on — it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to Be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to Be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to Be Forgotten" Online — in the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared U.S. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words, don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data — you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion — of course, within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side — that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the courts. Google has understood the European law of balance between privacy and free speech. The decisions were, in the beginning, very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files are sold — for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives parents — and children when they become 18 — the possibility to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. It works by suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    The suppression of true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control — they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement why that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, “no longer relevant in the view of the complainant,” information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the “right to be forgotten,” is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is “censorship,” that the kind of right that we’re talking about, where an individual can go to Google and say, “When my name comes up, I don’t want this link to show up or this link to show up or this link to show up,” amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist dictators or communist dictators, don’t want anyone to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU “right to be forgotten” exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say “deletion of information published by others,” that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term “tort action”? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word “censorship.” I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a “right to be forgotten” implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word “censorship” is not helpful here. What the “Right to Be Forgotten” does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, “Look, there’s just nothing much to see here.” Like, this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that, for everybody who makes use of this right, there is some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the “Right to Be Forgotten” doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, “I’m a Democrat,” “I’m a Republican.” Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, “You can have this, or you can have every single thing you do in the world be Google-able forever,” I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it (and I’m, again, hoping it will be more than Google who does this: please, Bing, do it too; please, people we haven’t heard of, do it too), the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are “no longer relevant,” “inadequate” information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make, on a daily basis, a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made, himself or herself, the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google “right to be forgotten horror stories.” That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European “right to be forgotten.” Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results and — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the “right to be forgotten” infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked on implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain:
    What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan:
    Well —

    Jonathan Zittrain:
    So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan:
    Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain:
    Thank you.

    John Donvan:
    Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker:
    Hi.

    John Donvan:
    If you can stand up and grab the mic.

    Male Speaker:
    Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan:
    Paul Nemitz, in Europe.

    Paul Nemitz:
    This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan:
    Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin:
    I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz:
    All search engines.

    Andrew McLaughlin:
    Paul frames this as just like a very "nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz:
    It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan:
    Sir. Can you tell us your name, please?

    Male Speaker:
    Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan:
    Good question.

    Male Speaker:
    And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan:
    Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain:
    It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin:
    I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan:
    Eric, did you want to respond to that?

    Eric Posner:
    It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.

    [laughter]

    Eric Posner:
    You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan:
    [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker:
    [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan:
    Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "right to be forgotten."

    Female Speaker:
    Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that a reasonable expectation of privacy is something you don’t have the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan:
    Eric Posner.

    Eric Posner:
    Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan:
    I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker:
    For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan:
    Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain:
    I —

    John Donvan:
    Thank you for the question.

  • 01:06:56

    Jonathan Zittrain:
    — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan:
    Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan:
    When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan:
    Yeah, thanks.

    Male Speaker:
    This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan:
    Okay. So you’re asking, if you were smart enough — which sounds like you probably are —

    [laughter]

    John Donvan:
    — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz:
    Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "What do you have about me?" in Europe apply to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.

    John Donvan:
    So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz:
    Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan:
    Sir in the back near the wall. Thanks.

    Male Speaker:
    Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.

    John Donvan:
    Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker:
    In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal against imposing something like this?

    John Donvan:
    In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe — than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker:
    Not exactly. I’m saying another way of press suppression —

    John Donvan:
    Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker:
    Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain:
    I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan:
    Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner:
    I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan:
    From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin:
    Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still: censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan:
    Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz:
    Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker:
    But Paul —

    Paul Nemitz:
    This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker:
    Yeah, Paul, I have to — [unintelligible].

    John Donvan:
    [unintelligible].

    Andrew McLaughlin:
    I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan:
    But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.

    Andrew McLaughlin:
    They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner:
    There are interests on both sides.

    John Donvan:
    Eric Posner.

    Eric Posner:
    So, that’s the classic role for the law, so that the interests on both sides —

    Andrew McLaughlin:
    How are they vindicated?

    Eric Posner:
    And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money by searching well — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain:
    — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner:
    No, but they don’t want to set a precedent where they —

    Jonathan Zittrain:
    Google will pay no penalty for over-deleting.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    And no one will even know.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    Who will even know?

    Eric Posner:
    They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan:
    And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan:
    Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate General for Justice and Consumers.

    Paul Nemitz:
    Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan:
    Thank you, Paul Nemitz.

    [applause]

    John Donvan:
    And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin:
    You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual stands on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met the burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan:
    Thank you, Andrew McLaughlin.

    [applause]

    John Donvan:
    And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner:
    This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan:
    Thank you, Eric Posner.

    [applause]

    John Donvan:
    The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain:
    I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening is shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that’s not on a specific case, but only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page, marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan:
    Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain:
    — can I end with one thing —

    John Donvan:
    — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan:
    And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain:
    Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan:
    Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center, and we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared U.S. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "right to be forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around — but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision, you know, was in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold. For example, in a political campaign, those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: the right to be forgotten. We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

The "right to be forgotten" intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s, would have gossiped, where today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement why that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests. It’s like, time to make the donuts. You’re like, yes, no, no. And Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the “for” side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right to have information that is, in the words of the decision by the European Court of Justice, “no longer relevant” in the view of the complainant, taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is, the U.S. should adopt the “right to be forgotten” online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the U.S. should adopt the “right to be forgotten” online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, that is. But they say the solution, the “right to be forgotten,” is a terrible one, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is “censorship”: that the kind of right we’re talking about, where an individual can go to Google and say, “When my name comes up, I don’t want this link to show up or this link to show up or this link to show up,” amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU “right to be forgotten” exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say deletion of information published by others, that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term “tort action”? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent one. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word “censorship.” I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a “right to be forgotten” implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word “censorship” is not helpful here. What the “Right to Be Forgotten” does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, “Look, there’s just nothing much to see here.” Like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the “Right to Be Forgotten” doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I’ll give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, “I’m a Democrat,” “I’m a Republican.” Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is, what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes. So that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or for the individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, “You can have this, or you can have every single thing you do in the world be Google-able forever,” I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are “no longer relevant,” “inadequate” information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make a decision on a daily basis: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made the decision himself or herself to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google “right to be forgotten horror stories.” That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European “right to be forgotten.” Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay. I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the “right to be forgotten” infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in a story, one that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curveball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific just because the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented. John Donvan: [unintelligible] take right down in the front, in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "right to be forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, where a reasonable expectation of privacy is something you don’t have the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion — do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I’m saying it’s another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And, by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, where, you know, there’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money from search — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s, "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on — it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richmond Center for Business Law and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online — in the first vote, 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, for the team arguing for the motion, their second vote was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. From me, John Donvan, and Intelligence Squared U.S., thank you. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s — "Right to Be Forgotten," and you said that there is a certain elegance to the idea because — you’re saying — something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society — because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask the school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state — because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and for children when they become 18 — to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the federal trade commissioner, Julie Brill, just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days the divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger, if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making a decision that, at the insistence of the government, it must keep secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers, they have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made the decision himself or herself to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of like EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote, what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results and — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan:

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Faulkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way in your view to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
John Donvan: Well —
Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
Jonathan Zittrain: Thank you.
John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
Male Speaker: Hi.
John Donvan: If you can stand up and grab the mic.
Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
John Donvan: Paul Nemitz, in Europe.
Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that?
Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
Paul Nemitz: All search engines.
Andrew McLaughlin: Paul frames this as a very "nothing to see here," incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
Paul Nemitz: It’s not Google-specific just because the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
John Donvan: Sir. Can you tell us your name, please?
Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
John Donvan: Good question.
Male Speaker: And then if, you know, Google wanted to deny a request, or didn’t want to deal with the process, you could have some special advocate appointed to —
John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
John Donvan: Eric, did you want to respond to that?
Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I are very close together — meaning that he’s on our side.
[laughter]
Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think that there are administrative criteria we can set. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, arguing for the "right to be forgotten."
Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search and seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner.
Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
Jonathan Zittrain: I —
John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.
John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
[laughter]
John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
[talking simultaneously]
John Donvan: Yeah, thanks.
Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —
[laughter]
John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, as for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion, and the right to ask what you have about me, applies in Europe to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
John Donvan: So, how far are you getting in asking the NSA to delete information?
[laughter]
Paul Nemitz: Well, the answer is: he described a case where he would, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks.
Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.
John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is an alarming signal about imposing something like this?
John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —
Male Speaker: Not exactly. I’m saying another way of press suppression —
John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
[applause]
Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook, and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin.
Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so my view on this is, still, censorship — which I don’t mean to be kind of a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.
Male Speaker: But Paul —
Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
Male Speaker: Yeah, Paul, I have to — [unintelligible].
John Donvan: [unintelligible].
Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. Nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
John Donvan: But it does live on. It does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.
Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that.
Eric Posner: There are interests on both sides.
John Donvan: Eric Posner.
Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —
Andrew McLaughlin: How are they vindicated?
Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. Google makes money from searching well: the better its searches are, the more money it makes. This is why —
[talking simultaneously]
Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
Eric Posner: No, but they don’t want to set a precedent where they —
Jonathan Zittrain: Google will pay no penalty for over-deleting.
Eric Posner: They will.
Jonathan Zittrain: And no one will even know.
Eric Posner: They will.
Jonathan Zittrain: Who will even know?
Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause]
John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the European Commission’s Directorate-General for Justice and Consumers.
Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote: the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.
John Donvan: Thank you, Paul Nemitz.
[applause]
John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. He frames it as the empowerment of an individual on one side of a conversation — which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. The affirmative case has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
John Donvan: Thank you, Andrew McLaughlin.
[applause]
John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children; or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is the first thing that people learn about you when they Google your name. So when you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
John Donvan: Thank you, Eric Posner.
[applause]
John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
[laughter]
This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that review happens not on a specific case, only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition —
John Donvan: Jonathan Zittrain, I’m sorry —
Jonathan Zittrain: — can I end with one thing —
John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
[laughter]
[applause]
And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]
John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
[talking simultaneously]
Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
[applause]
John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second; they went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship directorate in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have that privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    The suppression of true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, and she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, "here’s what was deleted today," it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, for example, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it (and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too), the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is that through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, there are very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, I need to — just in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Do they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in a story, which won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And, Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I'm Adam. This — this law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea. Paul Nemitz: It's not Google-specific; the case just happened to be about Google.

  • 01:00:57

But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power would be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It's not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set up in advance precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you're talking about, the side arguing for the motion — they're arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, that says, you know, in the Fourth Amendment search and seizure context, that a reasonable expectation of privacy, you don't have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That's in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on. Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it's exactly the kind of surveillance on who's wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "For" side. I'm a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU, it sounds like it'd give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us. John Donvan: Okay. So you're asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you'd like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I'm saying another way of press suppression — John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you're 50, now, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They're only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody's life. And so, you know, my view on this is, still, censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they've tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we're going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It's not a discretionary decision of Google, and that's, you know, the blue sky, and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I've got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It's not removed from the internet. It's just you can't find it through the criminal's name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee's name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that's the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what's on the side of disclosure is something far more powerful than the government. It's the profit motive. So Google makes money: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don't want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that's our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation, which is what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It's the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it's justified, and it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it's you, that perhaps you're going through hard times in your marriage if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell, I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can't believe that in the last slot I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it's shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn't know solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it's a search on a name, that maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won't go to the second page; reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I'm sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so. [applause] John Donvan: Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe’s "right to be forgotten" ruling came down a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a co-founder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense, I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to convert America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten": the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, than any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, the school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question, who brought this case from Spain, had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives parents the possibility, and children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," are powerful. And understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to force other people to forget what they would otherwise remember. That ability, that right to remember, is, I think, one of the most fundamental rights of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by removing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about a person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of this Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. Those of you who are old enough will remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the "right to be forgotten" would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant," in the view of the complainant, information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion — literally — of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curveball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared U.S. debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This — this law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe?

    Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents, in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: are there mechanisms to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the U.S.: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the U.S. to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious that you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states — allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.

    [laughter]

    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria. Now, it’s often very difficult to set these up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "right to be forgotten."

    Female Speaker: Right. So, to me it’s based on an inherent right to privacy, but how do you square that with the U.S. position — going to your constitutional point — which says, in the Fourth Amendment search-and-seizure context, that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yes, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that will probably respect both interests, the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then found, and I think the Supreme Court now would find, that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me?" in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?

    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps an alarming signal against imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying another way of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about it. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so my view is that, still, censorship — and I don’t mean that as a kind of nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody can create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — an independent agency, something like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC-like authority.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, that’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think this is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: all of the pressures that you just outlined are toward deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee: the fans of the team that was wronged by the false penalty call may want that referee’s name to be associated with it.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law, so that the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money: the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes round two of this Intelligence Squared U.S. debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the European Commission’s Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the U.S. should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the U.S. should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that’s our motion, "The U.S. Should Adopt a ‘Right to Be Forgotten’ Online." And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. The construction that Paul just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation. That is what speech on the internet is — it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths to deal with embarrassing mistakes that come out on the internet: more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified by overcoming a very high bar in order to stand as an exception to the right of free speech. The affirmative case has not met its burden of showing that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, and to some extent by statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come of this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you: who is the "he" in that statement? Is the "he" you? Is the "he" me? Or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the pro folks were saying over the course of the evening is shifting. It’s a standard of, "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff comes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure there is transparency not on specific cases, only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about. You just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what was it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make: we all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan: Well, I’m sorry the clock is my master, and so it has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and the ticket sales don't come close to covering the cost of mounting a great debate like this. So, I'd encourage you, if you would be willing to make a donation, to go to our website, that's IQ2US.org, and make a donation there. Our next debate is later this month. It's at Columbia University at the Miller Theatre. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center, and we're going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we're going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to be Forgotten' Online." We've heard the arguments — sorry? Did I — sorry? There's chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote on "The U.S. Should Adopt the 'Right to be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion had a first vote of 26 percent and a second vote of 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We'll see you next time. [applause]

  • 00:01:04

    Let's meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that's what you're going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner, from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You've written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe adopted its "Right to Be Forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven't, but I'm hoping they'll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let's welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You're no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I'd probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There's really no place in the world that's been so affected by it. But I do think it's a travesty. John Donvan: Okay, we're making clear you're not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you're a professor of law and computer science at Harvard. You're a cofounder of the Berkman Center for Internet & Society. You looked at the EU's — Europe's "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you're saying — something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I've described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We're very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you'll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let's register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you're for the motion, number two if you're against, and number three if you're undecided. And we'll take about 15 seconds to complete that process, and then we'll lock out the voting devices. All right, folks, let's lock out those votes. And, again, we'll have the second vote right after you've heard the closing arguments. And then we'll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I'm not here on an official mission from the European Commission to turn America to this notion of law. It's more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don't tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the '70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can't ask the press to delete a bad review if you are a concert pianist, or, for that matter, if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that's what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don't have that privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this just doesn't fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn't apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else — any other source of information — about this person, and our law requires that, in the same way that our people can ask data brokers, or the bank, or the school, after a certain time, for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it's in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn't mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go on to the courts. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what's happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California there is now a law which gives parents — and children, when they become 18 — the possibility to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, information about your credit which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America: a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion — the emotional rationale that lies behind the "right to be forgotten" — are powerful. And they are understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I'm going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. And there are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn't even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that's the outline of the argument that I'm going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it's a duty, and Paul owned up to this very directly in his opening statement. The "right to be forgotten" is the power to force other people to forget what they would otherwise remember. That ability, that right to remember, is, I think, one of the most fundamental rights of being a human being. By suppressing true information — and let's be very clear, that is what we're talking about here. We're talking about the suppression of true information: not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we're talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it's way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court's decision is that information should be deleted from searches about a person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It's very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul's claim that it's working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you'll find the case of the piano player who didn't like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it's conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that's not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we've seen that the internet adapts. We've seen the rise of applications like Snapchat and Wuut — W-U-U-T — which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so, finally, I'll just end on this note: as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what's going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I'm John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You've heard two of the opening arguments, and now on to the third. Let's welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I'd like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or, for some of the younger people here, I'll just have to tell you what it was like then — we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He's arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That's not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not. They know a lot about this kid. He moves on with his life. It's not a big deal. Second example: a single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, misses some credit card bills, and files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she's able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her, what she's like and how she is a good mother. So, they know a little bit about what's going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other's infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they were private. Not everybody in the world, or even outside this area, hears about these events. And the law protected them. There were privacy laws — they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let's say it could be an employer or a creditor or a future romantic partner — it's really in the stranger's interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn't convicted, it's still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not-so-smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people's privacy." But, of course, this is what's happened. It's happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the '80s and '90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay, that's completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people's privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let's suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name into Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, or he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother's friends, you know — in the '90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would've written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That's how she's going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It's public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents' names into Google, and out come these allegations. And not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they've lost control over this information about themselves. And it's not just that they've lost this control; they've lost the important context. You know, the neighbors, the people in the neighborhood, know the context of these events. The strangers who do the Google search do not. So, the "Right to Be Forgotten" would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is "Are there problems with privacy?" — yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were "Have problems of privacy gotten more difficult over time?" — we'd vote for that motion, too. I think I'm comfortable saying Andrew would vote for it. If the motion were "Should everything be recorded at all times, and if so, should we do something about that?" — I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts. You’re like, yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the "for" side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding, where we are coming from, is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say deletion of information published by others, that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? A) Whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the U.S. is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what's called "hitting a curveball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared U.S. debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I'm Adam. This law seems to be very specifically tailored to Google. And I'm wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents, in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.
    Paul Nemitz: It's not Google-specific; it's just that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the U.S.: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the U.S., to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious that you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side. For the —
    [laughter]
    Eric Posner: — you know, for example, I would completely rule out public figures from using the "right to be forgotten": only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set these up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down on your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you're talking about: the side arguing for the motion. They're arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search-and-seizure context, which says that, as far as a reasonable expectation of privacy goes, you don't have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on.
    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity; you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as delete —
    John Donvan: Okay, I think we see where you're going with that. Let's let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it's exactly the kind of surveillance on who's wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and, if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.
    John Donvan: Okay. So you're asking, if you were smart enough, which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on, you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the "right to be forgotten" would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal against imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I'm saying it's another way of press suppression —
    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should the first thing they find be this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you're 50, the person who Googles you now may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life is, I think, heartbreaking. And it's something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody's life. And so, you know, my view on this is, still, that censorship — and I don't mean that to be kind of a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we're going to delete or not, what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority, which is something like an independent agency, like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google where, you know, there's the blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.
    John Donvan: But it does live on. It does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that's the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. Google makes money by searching well; the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don't want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared U.S. debate, where our motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission's Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the U.S. should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the U.S. should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that's our motion, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation. That is what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it's you: that perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you. This is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell. I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can't believe that, in the last slot, I'm still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening.
    [laughter]
    Jonathan Zittrain: This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It's a standard that's, "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure there's disclosure not on specific cases, only on the overall criteria, and that seems very, very dangerous to me. Now, John said we didn't have solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won't go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I'm sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about: you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.
    [applause]
    John Donvan: Well, I'm sorry, the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and the first vote: 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result was, for the team arguing for the motion, their second vote was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percent. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why this is important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he has done or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the federal trade commissioner, Julie Brill, just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control — they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers, they have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made, himself or herself, the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration, there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: Congress passed a statute that said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, which says that as for a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU, it sounds like, give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on, you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying it’s another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete, is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money from search; the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in the times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening is shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria; that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the Federal Trade Commissioner, Julie Brill, just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant," in the view of the complainant, information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook — you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain:
    What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration. There's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan:
    Well —

    Jonathan Zittrain:
    So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.

    John Donvan:
    Jonathan, that is what's called "hitting a curve ball out of the park."

    Jonathan Zittrain:
    Thank you.

    John Donvan:
    Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker:
    Hi.

    John Donvan:
    If you can stand up and grab the mic.

    Male Speaker:
    Hi. I'm Adam. This — this law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody whom I could then demand that my data be removed from, or not?

    John Donvan:
    Paul Nemitz, in Europe.

    Paul Nemitz:
    This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press, to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan:
    Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin:
    I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz:
    All search engines.

    Andrew McLaughlin:
    Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.

    Paul Nemitz:
    It's not Google-specific; it's that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individua l, when you put in a name, bringing up all the information existing about this individual, goes far deeper in your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, ple ase?Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically] . So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that makes the case more compelling? So, for example, you know, we imagine if this happe ned in the US, the Congress passed a statute and say something like, something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power will be delegated would promulgate some regulations. And the regulations would be a b it more specific. John Donvan: Good q uestion. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?

    Jonathan Zittrain:
    It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin:
    I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional for one thing, but I also think it’s just fundamentally a terrible, terrib le idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased that, you kno w, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.

    John Donvan:
    [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —

    Female Speaker:
    [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan:
    Just so that the radio audience knows who you're talking about, the side arguing for the motion — they're arguing for the "right to be forgotten."

    Female Speaker:
    Right. So to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search-and-seizure context, which says you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan:
    Eric Posner.

    Eric Posner:
    Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan:
    I'm going to — unless you really need to respond to that question, I'm going to move on.

    Female Speaker:
    For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —

    John Donvan:
    Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.

    Jonathan Zittrain:
    I —

    John Donvan:
    Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature –" I’m not making this up — "you have to regi ster at your local post office. Otherwise , for your own convenience the post office will trash any communist mail otherwise destined for you. The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.

    John Donvan:
    Okay. So you're asking, if you were smart enough — which it sounds like you probably are — [laughter]

    John Donvan:
    — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz:
    Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me?" in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.

    John Donvan:
    So, how far are you getting in asking the NSA to delete information? [laughter]

    Paul Nemitz:
    Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan:
    Sir in the back near the wall. Thanks.

    Male Speaker:
    Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you'd like to address that.

    John Donvan:
    Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker:
    In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan:
    In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker:
    Not exactly. I'm saying another way of press suppression —

    John Donvan:
    Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker:
    Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]

    Jonathan Zittrain:
    I wouldn't want Google to be uniquely privileged to answer that question.

    John Donvan:
    Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner:
    I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, now, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan:
    From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet ha s leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete is not the right answer. We need to have the internet evolve. I’ll like to see Google evolve, to add a more direc t right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan:
    Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation — to make the decision, yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz:
    Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency, like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker:
    But Paul —

    Paul Nemitz:
    This is an area — this is an area which is subject to the law. It's not a discretionary decision of Google where, you know, there's the blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker:
    Yeah, Paul, I have to — [unintelligible].

    John Donvan:
    [unintelligible].

    Andrew McLaughlin:
    I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.

    John Donvan:
    But it does — it does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.

    Andrew McLaughlin:
    They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner:
    There are interests on both sides.

    John Donvan:
    Eric Posner.

    Eric Posner:
    So, that's the classic role for the law, to balance the interests on both sides —

    Andrew McLaughlin:
    How are they vindicated?

    Eric Posner:
    And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. Google makes money from searching well; the better its searches are, the more money it makes. This is why — [talking simultaneously]

    Jonathan Zittrain:
    — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner:
    No, but they don't want to set a precedent where they —

    Jonathan Zittrain:
    Google will pay no penalty for over-deleting.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    And no one will even know.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    Who will even know?

    Eric Posner:
    They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan:
    And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan:
    Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz:
    Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan:
    Thank you, Paul Nemitz. [applause]

    John Donvan:
    And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin:
    You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation. That is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of th e social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the de bate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of differentkinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This ca se, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image th at I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan:
    Thank you, Andrew McLaughlin. [applause]

    John Donvan:
    And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.

    Eric Posner:
    This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it's you: perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children; or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come of this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell; I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan:
    Thank you, Eric Posner. [applause]

    John Donvan:
    The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain:
    I can't believe that, in the last slot, I'm still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening. [laughter] This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to keep that reality from coming about. But notice how what the folks were saying over the course of the evening is shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn't hear from the other side answers to questions like: When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know? Paul has been working, I think, to make sure disclosure happens not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn't have solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page, marked as such, and then many won't go to the second page; at reputation systems; and, of course, competition —

    John Donvan:
    Jonathan Zittrain, I'm sorry —

    Jonathan Zittrain:
    — can I end with one thing —

    John Donvan:
    — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this w as a little bit of a tough debate in terms of it was relatively nuanced and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about. You just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan:
    And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what was it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously]

    Jonathan Zittrain:
    Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so. [applause]

    John Donvan:
    Well, I'm sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and the ticket sales don't come close to covering the cost of mounting a great debate like this. So, I'd encourage you, if you would be willing to make a donation, to go to our website, that's IQ2US.org, and make a donation there. Our next debate is later this month. It's at Columbia University at the Miller Theater. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We're going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we're going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to be Forgotten' Online." We've heard the arguments -- sorry? Did I -- sorry? There's chatter that I thought was a response to some sort of mistake I made. Oh, it came up already?

    Male Speaker:
    It was erased.

    [laughter]

    Male Speaker:
    Does anybody remember?

    [applause]

    John Donvan:
    All right. Well, for the sake of people who are listening --

    [laughter]

    Play along, please.

    [laughter]

    Our motion is this: "The U.S. Should Adopt the 'Right to be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote on "The U.S. Should Adopt the 'Right to be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points.

    [applause]

    John Donvan:
    The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We'll see you next time.

    [applause]

  • 01:31:40

  • 00:00:00

    John Donvan:
    When you -- when you go online and Google yourself -- and we know that you do Google yourself. I do it. I did it once. I've done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let's have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared U.S. I'm John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let's meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz.

    [applause]

    John Donvan:
    Paul, you are the director of this -- of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC.

    [laughter]

    John Donvan:
    And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what?

    Paul Nemitz:
    I meant that there are limits to snooping, collecting, and making my private life public on Google.

    John Donvan:
    And that's what you're going to be arguing tonight. And tell us who is your partner in that argument.

    Paul Nemitz:
    The eminent professor Eric Posner from the University of Chicago Law School.

    John Donvan:
    Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause]

    Yes, Eric, you are a law professor. You've written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now?

    Eric Posner:
    No, they haven't, but I'm hoping they'll change their minds soon enough.

    John Donvan:
    Especially after tonight.

    Eric Posner:
    Especially after tonight.

    John Donvan:
    Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

    [applause]

    John Donvan:
    And we have two debaters arguing vociferously against this motion. Please, let's welcome Andrew McLaughlin.

    [applause]

    John Donvan:
    Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You're no longer at Google, but if you were, would you be using that kind of language?

    Andrew McLaughlin:
    [laughs] Well, I'd probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There's really no place in the world that's been so affected by it. But I do think it's a travesty.

    John Donvan:
    Okay, we're making clear you're not here tonight as the Google guy. And tell us who your partner is, Andrew?

    Andrew McLaughlin:
    The equally eminent Professor Jonathan Zittrain.

    John Donvan:
    Ladies and gentlemen, Jonathan Zittrain.

    [applause]

    John Donvan:
    The eminent Jonathan Zittrain, you're a professor of law and computer science at Harvard. You're a cofounder of the Berkman Center for Internet & Society. You looked at the EU's -- Europe's "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you're saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain:
    Well, we could end early and just hit the bar, but --

    [laughter]

    Jonathan Zittrain:
    -- I've described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes.

    John Donvan:
    All right. We're very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you'll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let's register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you're for the motion, number two if you're against, and number three if you're undecided. And we'll take about 15 seconds to complete that process, and then we'll lock out the voting devices. All right, folks, let's lock out those votes. And, again, we'll have the second vote right after you've heard the closing arguments. And then we'll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz.

    [applause]

    Paul Nemitz:
    Thank you very much. Thank you. So let me say, first of all, I'm not here on an official mission from the European Commission to turn America to this notion of law. It's more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten": the words, the natural words, don't tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the '70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can't ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that's what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don't have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn't fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn't apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank (you know, they can ask in school after a certain time for things to be deleted), they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it's in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn't mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what's happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California there is now a law which gives parents the possibility, and children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much.

    John Donvan:
    Thank you, Paul Nemitz.

    [applause]

    John Donvan:
    And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin.

    [applause]

    Andrew McLaughlin:
    Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear it generates is that you will forever be defined by that one mistake you made that ended up on the internet or in the press. The problem, though, is the following, and this is what I'm going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn't even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that's the outline of the argument that I'm going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it's a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information -- and let's be very clear, that is what we're talking about here. We're talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we're talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said, the way he framed it, that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it's way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court's decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It's very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul's claim that it's working kind of well, you can Google -- well, you can Google while you can -- any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you'll find the case of the piano player who didn't like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it's conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that's not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we've seen that the internet adapts. We've seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I'll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.

    John Donvan:
    Thank you, Andrew McLaughlin.

    [applause]

    John Donvan:
    And a reminder of what's going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I'm John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You've heard two of the opening arguments, and now on to the third. Let's welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause]

    Eric Posner:
    I'd like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or for some of the younger people here, I'll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He's arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That's not published in the newspaper. Now, the people around him in his neighborhood may know about this; they may not know about it. They know a lot about this kid. He moves on with his life. It's not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she's able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she's like and how she is a good mother. So, they know a little bit about what's going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other's infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they're private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws -- there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves; not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic -- a kooky academic like me or Jonathan -- came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the '80s and '90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay, that's completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people's privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later -- let's suppose all of this happened, you know, in the last few years -- he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother's friends, you know, in the '90s would have gossiped, where maybe today they would write about it on Facebook. Maybe she would've written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in -- maybe she moves to a new city -- puts her name in and immediately finds out that she had a mental illness. That's how she's going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It's public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents' names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they've lost control over this information about themselves, and it's not just that they've lost this control; they've lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause]

    John Donvan:
    Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy --

    [applause]

    John Donvan:
    -- School of Government. Ladies and gentlemen, Jonathan Zittrain.

    Jonathan Zittrain:
    Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We'd vote for that motion, too. I think I'm comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the “for” side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want anyone to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or if we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish premier league referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, saves the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain:
What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

John Donvan:
Well —

Jonathan Zittrain:
So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.

John Donvan:
Jonathan, that is what's called "hitting a curve ball out of the park."

Jonathan Zittrain:
Thank you.

John Donvan:
Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

Male Speaker:
Hi.

John Donvan:
If you can stand up and grab the mic.

Male Speaker:
Hi. I'm Adam. This — this law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?

John Donvan:
Paul Nemitz, in Europe.

Paul Nemitz:
This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan:
Andrew, CEO of Digg, how do you feel about that?

Andrew McLaughlin:
I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

Paul Nemitz:
All search engines.

Andrew McLaughlin:
Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it's such a bad idea.

Paul Nemitz:
It's not Google-specific; it's just that the case was about Google.

  • 01:00:57

But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

John Donvan:
Sir. Can you tell us your name, please?

Male Speaker:
Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute and said something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power would be delegated would promulgate some regulations. And the regulations would be a bit more specific.

John Donvan:
Good question.

Male Speaker:
And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —

John Donvan:
Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

Jonathan Zittrain:
It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.

Andrew McLaughlin:
I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.

John Donvan:
Eric, did you want to respond to that?

Eric Posner:
It's not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side. For the —

[laughter]

Eric Posner:
— you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.

John Donvan:
[unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —

Female Speaker:
[unintelligible] as a follow-up question to yours. In order for your side to win —

John Donvan:
Just so that the radio audience knows who you're talking about — the side arguing for the motion. They're arguing for the "right to be forgotten."

Female Speaker:
Right. So to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point — in the Fourth Amendment search-and-seizure context, a reasonable expectation of privacy: you don't have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

John Donvan:
Eric Posner.

Eric Posner:
Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.

John Donvan:
I'm going to — unless you really need to respond to that question, I'm going to move on.

Female Speaker:
For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as delete —

John Donvan:
Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.

Jonathan Zittrain:
I —

John Donvan:
Thank you for the question.

  • 01:06:56

Jonathan Zittrain:
— I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it's exactly the kind of surveillance of who's wanting to peek under the envelope that all sides appear to be concerned about.

John Donvan:
Another question? Right in the center there. Adamant waving actually does work with me.

[laughter]

John Donvan:
When I see everybody going nuts. Can you tell us your name, please? Again —

[talking simultaneously]

John Donvan:
Yeah, thanks.

Male Speaker:
This question is for the "for" side. I'm a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.

John Donvan:
Okay. So you're asking, if you were smart enough — which it sounds like you probably are —

[laughter]

John Donvan:
— to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

Paul Nemitz:
Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart.

John Donvan:
So, how far are you getting in asking the NSA to delete information?

[laughter]

Paul Nemitz:
Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan:
Sir in the back near the wall. Thanks.

Male Speaker:
Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you'd like to address that.

John Donvan:
Can you rephrase what your question specifically is in terms of the motion here?

Male Speaker:
In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

John Donvan:
In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —

Male Speaker:
Not exactly. I'm saying another way of press suppression —

John Donvan:
Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

Male Speaker:
Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

[applause]

Jonathan Zittrain:
I wouldn't want Google to be uniquely privileged to answer that question.

John Donvan:
Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner:
I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

John Donvan:
From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They're only looking at what comes up on the first page. Andrew McLaughlin.

Andrew McLaughlin:
Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody's life. And so, you know, my view on this is, still, censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

John Donvan:
Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?

Paul Nemitz:
Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.

Male Speaker:
But Paul –

Paul Nemitz:
This is an area — this is an area which is subject to the law. It's not a discretionary decision of Google, that, you know, the blue sky, and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

Male Speaker:
Yeah, Paul, I have to — [unintelligible].

John Donvan:
[unintelligible].

Andrew McLaughlin:
I think you're evading the central objection of our side to what you're saying, and it is this: all of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.

John Donvan:
But it does — it does live on. It's not removed from the internet. It's just you can't find it through the criminal's name.

Andrew McLaughlin:
They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with that.

  • 01:17:01

They have an interest in that.

Eric Posner:
There are interests on both sides.

John Donvan:
Eric Posner.

Eric Posner:
So, that's the classic role for the law, to balance the interests on both sides —

Andrew McLaughlin:
How are they vindicated?

Eric Posner:
And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —

[talking simultaneously]

Jonathan Zittrain:
— talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

Eric Posner:
No, but they don't want to set a precedent where they —

Jonathan Zittrain:
Google will pay no penalty for over-deleting.

Eric Posner:
They will.

Jonathan Zittrain:
And no one will even know.

Eric Posner:
They will.

Jonathan Zittrain:
Who will even know?

Eric Posner:
They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

John Donvan:
And that concludes round two of this Intelligence Squared US debate, where our motion is "The US Should Adopt the 'Right to Be Forgotten' Online."

  • 01:17:58

[applause]

John Donvan:
Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

Paul Nemitz:
Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.

John Donvan:
Thank you, Paul Nemitz.

[applause]

John Donvan:
And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

Andrew McLaughlin:
You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That's what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it's justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

John Donvan:
Thank you, Andrew McLaughlin.

[applause]

John Donvan:
And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.

Eric Posner:
This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it's you: that perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

John Donvan:
Thank you, Eric Posner.

[applause]

John Donvan:
The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

Jonathan Zittrain:
I can't believe that, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

[laughter]

This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it's shifting. It's a standard that's, "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that happens not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn't know solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it's a search on a name, that maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won't go to the second page; reputation systems; and, of course, competition —

John Donvan:
Jonathan Zittrain, I'm sorry —

Jonathan Zittrain:
— can I end with one thing —

John Donvan:
— your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

[laughter]

[applause]

John Donvan:
And that concludes our closing statements.

  • 01:27:04

And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]

John Donvan:
And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

[talking simultaneously]

Jonathan Zittrain:
Here was the landing I was hoping to make: we all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.

[applause]

John Donvan:
Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you if you would be willing to make a donation to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online and the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote the result was for the team arguing for the motion their second vote was 35 percent, that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you. So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station, all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten" is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten".

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember is one of the most fundamental rights I think of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing, vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat, Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy, he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on but they still have a relatively complete image of who she is. A third example, a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is are there problems with privacy? Yes, there are. And if that were the motion I would hope you would join us in voting for it. If the question were have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes, when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point, the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that, whether fascist dictators or communist dictators, they don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must never say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, which, at the insistence of the government, it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in this story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific — it’s just that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "What do you have about me?" in Europe applies to the state and also to corporations — but of course, within reasonable national security limitations and so on; you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is, perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe — than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did? Or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still: censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America against a decision of the FTC, to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where it’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. Because that is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. That perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor: this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you? Is the "he" me? Or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the folks on the other side were saying over the course of the evening keeps shifting. It’s a standard of — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and in the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision, you know, in the beginning was very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist dictators or communist dictators, don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding, where we are coming from, is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger, if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision at the insistence of the government, it must keep it secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers, they have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that this were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain:
    What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan:
    Well —

    Jonathan Zittrain:
    So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.

    John Donvan:
    Jonathan, that is what's called "hitting a curve ball out of the park."

    Jonathan Zittrain:
    Thank you.

    John Donvan:
    Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: the US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker:
    Hi.

    John Donvan:
    If you can stand up and grab the mic.

    Male Speaker:
    Hi. I'm Adam. This law seems to be very specifically tailored to Google. And I'm wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan:
    Paul Nemitz, in Europe.

    Paul Nemitz:
    This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan:
    Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin:
    I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz:
    All search engines.

    Andrew McLaughlin:
    Paul frames this as just a very "nothing to see here," incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.

    Paul Nemitz:
    It's not Google-specific just because the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan:
    Sir. Can you tell us your name, please?

    Male Speaker:
    Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: are there mechanisms to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something is irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan:
    Good question.

    Male Speaker:
    And then if, you know, Google wanted to deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —

    John Donvan:
    Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain:
    It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin:
    I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.

    John Donvan:
    Eric, did you want to respond to that?

    Eric Posner:
    It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes, in most states, that allow people's criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about — how strict the criteria would be — and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side.

    [laughter]

    Eric Posner:
    You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria. Now, often it's very difficult to set them up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.

    John Donvan:
    [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —

    Female Speaker:
    [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan:
    Just so that the radio audience knows who you're talking about — the side arguing for the motion. They're arguing for the "right to be forgotten."

    Female Speaker:
    Right. So to me it's based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search-and-seizure context, where you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan:
    Eric Posner.

    Eric Posner:
    Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan:
    I'm going to — unless you really need to respond to that question, I'm going to move on.

    Female Speaker:
    For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past, there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan:
    Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.

    Jonathan Zittrain:
    I —

    John Donvan:
    Thank you for the question.

  • 01:06:56

    Jonathan Zittrain:
    — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found, and would find, that unconstitutional, because it's exactly the kind of surveillance on who's wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan:
    Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan:
    When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan:
    Yeah, thanks.

    Male Speaker:
    This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that — and, if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.

    John Donvan:
    Okay. So you're asking: if you were smart enough, which it sounds like you probably are —

    [laughter]

    John Donvan:
    — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz:
    Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "What do you have about me?" in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.

    John Donvan:
    So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz:
    Well, the answer is: he described a case where he would, as a developer, for his private purposes, design his own web engine. This constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan:
    Sir, in the back near the wall. Thanks.

    Male Speaker:
    Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me — particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

    John Donvan:
    Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker:
    In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan:
    In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker:
    Not exactly. I'm saying another way of press suppression —

    John Donvan:
    Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir, in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker:
    Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain:
    I wouldn't want Google to be uniquely privileged to answer that question.

    John Donvan:
    Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner:
    I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan:
    From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin:
    Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody's life. And so, you know, my view on this is, still, censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan:
    Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we're going to delete or not, what are the checks on that from outside? How does that work?

    Paul Nemitz:
    Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly — and then a judge will decide — or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC.

    Male Speaker:
    But Paul —

    Paul Nemitz:
    This is an area which is subject to the law. It's not a discretionary decision of Google where, you know, there's the blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker:
    Yeah, Paul, I have to — [unintelligible].

    John Donvan:
    [unintelligible].

    Andrew McLaughlin:
    I think you're evading the central objection of our side to what you're saying, and it is this: all of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.

    John Donvan:
    But it does live on. It does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.

    Andrew McLaughlin:
    They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner:
    There are interests on both sides.

    John Donvan:
    Eric Posner.

    Eric Posner:
    So, that's the classic role for the law — balancing the interests on both sides —

    Andrew McLaughlin:
    How are they vindicated?

    Eric Posner:
    And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. Google makes money from search — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain:
    — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner:
    No, but they don't want to set a precedent where they —

    Jonathan Zittrain:
    Google will pay no penalty for over-deleting.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    And no one will even know.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    Who will even know?

    Eric Posner:
    They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan:
    And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan:
    Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the European Commission's Directorate-General for Justice and Consumers.

    Paul Nemitz:
    Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan:
    Thank you, Paul Nemitz.

    [applause]

    John Donvan:
    And that's our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin:
    You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction. This is not the empowerment of an individual on one side of a conversation — which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But such censorship is justified only when it overcomes a very high bar in order to stand as an exception to the right of free speech. The affirmative case has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan:
    Thank you, Andrew McLaughlin.

    [applause]

    John Donvan:
    And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.

    Eric Posner:
    This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you. Perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan:
    Thank you, Eric Posner.

    [applause]

    John Donvan:
    The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain:
    I can't believe that, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening keeps shifting. It's a standard that's, "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that it's not about any specific case, only about the overall criteria — and that seems very, very dangerous to me. Now, John said we didn't offer solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding that, when it's a search on a name, maybe the first result shouldn't be those ten terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such, and then many won't go to the second page; at reputation systems; and, of course, competition —

    John Donvan:
    Jonathan Zittrain, I'm sorry —

    Jonathan Zittrain:
    — can I end with one thing —

    John Donvan:
    — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. We'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan:
    And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain:
    Here was the landing I was hoping to make: we all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.

    [applause]

    John Donvan:
    Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion had a first vote of 26 percent and a second of 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know — they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision, you know, was in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat, Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed. She has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant," in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear — when we say "deletion of information published by others," that is deletion of, literally, information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must never say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too, the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and — in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
John Donvan: Well —
Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
Jonathan Zittrain: Thank you.
John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
Male Speaker: Hi.
John Donvan: If you can stand up and grab the mic.
Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?
John Donvan: Paul Nemitz, in Europe.
Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that?
Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
Paul Nemitz: All search engines.
Andrew McLaughlin: Paul frames this as just, like, a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents, in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea.
Paul Nemitz: It’s not Google-specific; it’s that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
John Donvan: Sir. Can you tell us your name, please?
Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
John Donvan: Good question.
Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?
Jonathan Zittrain: It is curious that you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
John Donvan: Eric, did you want to respond to that?
Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —
[laughter]
Eric Posner: — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think there are administrative criteria we can set. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
John Donvan: [unintelligible] take right down in front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten."
Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner.
Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
Jonathan Zittrain: I —
John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
[laughter]
John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
[talking simultaneously]
John Donvan: Yeah, thanks.
Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
John Donvan: Okay. So you’re asking, if you were smart enough, which it sounds like you probably are —
[laughter]
John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
John Donvan: So, how far are you getting in asking the NSA to delete information?
[laughter]
Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks.
Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —
Male Speaker: Not exactly. I’m saying another way of press suppression —
John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
[applause]
Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin.
Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about it. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, as in Europe and in America against a decision of the FTC, to go to court.
Male Speaker: But Paul —
Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
Male Speaker: Yeah, Paul, I have to — [unintelligible].
John Donvan: [unintelligible].
Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
John Donvan: But it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with it.

  • 01:17:01

They have an interest in that.
Eric Posner: There are interests on both sides.
John Donvan: Eric Posner.
Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —
Andrew McLaughlin: How are they vindicated?
Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money the better its searches are — the better its searches are, the more money it makes. This is why —
[talking simultaneously]
Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
Eric Posner: No, but they don’t want to set a precedent where they —
Jonathan Zittrain: Google will pay no penalty for over-deleting.
Eric Posner: They will.
Jonathan Zittrain: And no one will even know.
Eric Posner: They will.
Jonathan Zittrain: Who will even know?
Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause]
John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.
John Donvan: Thank you, Paul Nemitz.
[applause]
John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case — the affirmative case — has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
John Donvan: Thank you, Andrew McLaughlin.
[applause]
John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
John Donvan: Thank you, Eric Posner.
[applause]
John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.
Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
[laughter]
Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure transparency comes not on specific cases, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —
John Donvan: Jonathan Zittrain, I’m sorry —
Jonathan Zittrain: — can I end with one thing —
John Donvan: — your time is up. No [spelled phonetically]. Thank you very much — just in fairness to everybody. Thank you, Jonathan.
[laughter]
[applause]
John Donvan: And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]
John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
[talking simultaneously]
Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
[applause]
John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because, you’re saying, something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is the application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house to cover the social contributions more attractive for the Spanish state, because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information: not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said, the way he framed it was, that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say deletion of information published by others, that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. What chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power or an individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — good work that Paul is doing in the EU — to look at a Facebook and say, gees, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers, they have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very "nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific just because the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, going to your constitutional point, at least in the U.S., the Fourth Amendment search-and-seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law; but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right of informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual is on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified because it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure disclosure happens not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and ticket sales don't come close to covering the cost of mounting a great debate like this. So, I'd encourage you, if you would be willing to make a donation, to go to our website, IQ2US.org, and make a donation there. Our next debate is later this month at Columbia University, at the Miller Theatre. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center, to continue the ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we're going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We've heard the arguments — sorry? Did I — sorry? There's chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. From me, John Donvan, and Intelligence Squared U.S., we'll see you next time. [applause]


  • 00:01:04

    Let's meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that's what you're going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent Professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You've most recently written the book "The Twilight of Human Rights Law." And back in the U.S., when Europe adopted its "right to be forgotten" a little while back, most American academics were quite skeptical and even outraged, but you said that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven't, but I'm hoping they'll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let's welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You're no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I'd probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There's really no place in the world that's been so affected by it. But I do think it's a travesty. John Donvan: Okay, we're making clear you're not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you're a professor of law and computer science at Harvard. You're a co-founder of the Berkman Center for Internet & Society. You looked at the EU's — Europe's — "right to be forgotten," and you said that there is a certain elegance to the idea, because you're saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I've described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We're very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you'll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let's register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you're for the motion, number two if you're against, and number three if you're undecided. And we'll take about 15 seconds to complete that process, and then we'll lock out the voting devices. All right, folks, let's lock out those votes. And, again, we'll have the second vote right after you've heard the closing arguments. And then we'll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I'm not here on an official mission from the European Commission to turn America to this notion of law. It's more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten" — the words, the natural words — they don't tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already in the '70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, if only to show what he or she has done in terms of liability. Of course you can't ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that's what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of individuals to decide for themselves what the state in particular knows about them, because if you don't have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government always already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this doesn't just fall from the sky; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn't apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals, and you profile them. When you put in the name of an individual, Google gives you more than anything else, more than any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, or the school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state — because if it's in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn't mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the courts. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what's happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, you can ask for credit information older than seven years to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I'm going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn't even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that's the outline of the argument that I'm going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it's a duty, and Paul owned up to this very directly in his opening statement. The "right to be forgotten" is the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. What we are talking about here is suppressing true information — and let's be very clear, that is what we're talking about: the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we're talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it's way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court's decision is that information should be deleted from searches about a person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It's very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul's claim that it's working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you'll find the case of the piano player who didn't like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches for his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it's conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that's not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this interest. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we've seen that the internet adapts. We've seen the rise of applications like Snapchat and Wuut (W-U-U-T), applications designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn, perhaps, not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I'll just end on this note: as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what's going on: we are halfway through the opening round of this Intelligence Squared U.S. debate. I'm John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You've heard two of the opening arguments, and now on to the third. Let's welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I'd like to ask you to cast your minds back 25, 30 years. For those of you who are old enough, you'll remember; for some of the younger people here, I'll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he's arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That's not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not. They know a lot about this kid. He moves on with his life. It's not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, misses some credit card bills, and files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she's able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she's like and how she is a good mother. So, they know a little bit about what's going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other's infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they were private. Not everybody in the world, or even outside the immediate area, heard about these events. And the law protected them. There were privacy laws — they still exist. There are even laws called expungement statutes that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine that back in 1990 an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the '80s and '90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay, that's completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people's privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so if we go through these three people again: this kid, 10 or 15 years later — let's suppose all of this happened in the last few years — applies for a job; the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother's friends, you know, in the '90s would have gossiped, whereas today they would write about it on Facebook. Maybe she would have written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — and immediately finds out that she had a mental illness. That's how she's going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It's public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents' names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they've lost control over this information about themselves. And it's not just that they've lost this control; they've lost the important context. You know, the neighbors, the people in the neighborhood, know the context of these events. The strangers who do the Google search do not. So, the "right to be forgotten" would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is "Are there problems with privacy?" — yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were "Have problems of privacy gotten more difficult over time?", we'd vote for that motion, too; I think I'm comfortable saying Andrew would vote for it. If the motion were "Should everything be recorded at all times, and if so, should we do something about that?", I would also be in favor. But none of these is the motion.

  • 00:28:54

The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

[laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, “no longer relevant in the view of the complainant,” information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

This issue of privacy, but they say the solution, the “right to be forgotten,” is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is “censorship,” that the kind of right that we’re talking about, where an individual can go to Google and say, “When my name comes up, I don’t want this link to show up or this link to show up or this link to show up,” amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU “right to be forgotten” exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say “deletion of information published by others,” that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term “tort action”? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word “censorship.” I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state with some trepidation to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

What’s happening here, in a “right to be forgotten” implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like, go down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word “censorship” is not helpful here. What the “Right to Be Forgotten” does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, “Look, there’s just nothing much to see here,” like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the “Right to Be Forgotten” doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger, if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, “I’m a Democrat,” “I’m a Republican.” Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, “You can have this, or you can have every single thing you do in the world be Google-able forever,” I’ll probably take this without even looking inside the box.

  • 00:47:07

But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too, the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

These criteria are “no longer relevant,” “inadequate” information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google “right to be forgotten horror stories.” That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European “right to be forgotten.” Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the “right to be forgotten” infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter]
    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion. They’re arguing for the "right to be forgotten."
    Female Speaker: Right. So, to me it’s based on an inherent right to privacy, but how do you square that with the fact that, at least in the U.S. — going to your constitutional point — in the Fourth Amendment search and seizure context, you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU, it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations — but of course, within reason, national security limitations and so on. You know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at the University of Oklahoma are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, where, you know, there is the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money by searching well — the better its searches are, the more money it makes. This is why — [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know because it will do worse than an alternative search engine, in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz. [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin. [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner. [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening is shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    "Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No [spelled phonetically]. Thank you very much — just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and the ticket sales don't come close to covering the cost of mounting a great debate like this. So, I'd encourage you, if you would be willing to make a donation, to go to our website, that's IQ2US.org, and make a donation there. Our next debate is later this month. It's at Columbia University at the Miller Theater. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We're going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we're going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates. And I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to be Forgotten' Online." We've heard the arguments — sorry? Did I — sorry? There's chatter that I thought was a response to some sort of mistake I made. Oh, it came up already?
    Male Speaker: It was erased.
    [laughter]
    Male Speaker: Does anybody remember?
    [applause]
    John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote on "The U.S. Should Adopt the 'Right to be Forgotten' Online": 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points.
    [applause]
    John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We'll see you next time.
    [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I've done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let's have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I'm John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let's meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz.
    [applause]
    John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what?
    Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google.
    John Donvan: And that's what you're going to be arguing tonight. And tell us who is your partner in that argument.
    Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School.
    John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause]
    John Donvan: Yes, Eric, you are a law professor. You've most recently written the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe adopted its "right to be forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now?
    Eric Posner: No, they haven't, but I'm hoping they'll change their minds soon enough.
    John Donvan: Especially after tonight.
    Eric Posner: Especially after tonight.
    John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online."
    [applause]
    John Donvan: And we have two debaters arguing vociferously against this motion. Please, let's welcome Andrew McLaughlin.
    [applause]
    John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You're no longer at Google, but if you were, would you be using that kind of language?
    Andrew McLaughlin: [laughs] Well, I'd probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There's really no place in the world that's been so affected by it. But I do think it's a travesty.
    John Donvan: Okay, we're making clear you're not here tonight as the Google guy. And tell us who your partner is, Andrew.
    Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain.
    John Donvan: Ladies and gentlemen, Jonathan Zittrain.
    [applause]
    John Donvan: The eminent Jonathan Zittrain, you're a professor of law and computer science at Harvard. You're a cofounder of the Berkman Center for Internet & Society. You looked at the EU's — Europe's "right to be forgotten," and you said that there is a certain elegance to the idea because — you're saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] — I've described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes.
    John Donvan: All right. We're very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you'll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let's register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you're for the motion, number two if you're against, and number three if you're undecided. And we'll take about 15 seconds to complete that process, and then we'll lock out the voting devices. All right, folks, let's lock out those votes. And, again, we'll have the second vote right after you've heard the closing arguments. And then we'll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz.
    [applause]
    Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I'm not here on an official mission from the European Commission to turn America to this notion of law. It's more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten" — the words, the natural words, they don't tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the '70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, if only to show what he or she has done in terms of liability. Of course you can't ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that's what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don't have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this doesn't just fall from the sky; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn't apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask a school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have this fact published in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it's in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn't mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what's happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives parents the possibility — and children when they become 18 — to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin.
    [applause]
    Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," are powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I'm going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn't even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that's the outline of the argument that I'm going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it's a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is really a power to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let's be very clear, that is what we're talking about here. We're talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we're talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it's way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court's decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It's very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul's claim that it's working kind of well, you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you'll find the case of the piano player who didn't like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches on his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it's conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that's not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we've seen that the internet adapts. We've seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I'll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And a reminder of what's going on: we are halfway through this opening round of the Intelligence Squared US debate. I'm John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You've heard two of the opening arguments, and now on to the third. Let's welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause]
    Eric Posner: I'd like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I'll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He's arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That's not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It's not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she's able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she's like and how she is a good mother. So, they know a little bit about what's going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other's infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they're private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let's say it could be an employer or a creditor or a future romantic partner — it's really in the stranger's interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn't convicted, it's still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people's privacy." But, of course, this is what's happened. It's happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the '80s and '90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That's completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people's privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let's suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother's friends, you know, in the '90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would've written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That's how she's going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It's public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents' names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they've lost control over this information about themselves, and it's not just that they've lost this control, they've lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause]
    John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] — School of Government. Ladies and gentlemen, Jonathan Zittrain.
    Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We'd vote for that motion, too. I think I'm comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, “no longer relevant in the view of the complainant,” information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the “right to be forgotten,” is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is “censorship,” that the kind of right that we’re talking about, where an individual can go to Google and say, “When my name comes up, I don’t want this link to show up or this link to show up or this link to show up,” amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU “right to be forgotten” exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say “deletion of information published by others,” that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term “tort action”? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word “censorship.” I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a “right to be forgotten” implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word “censorship” is not helpful here. What the “Right to Be Forgotten” does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, “Look, there’s just nothing much to see here,” like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the “Right to Be Forgotten” doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, “I’m a Democrat,” “I’m a Republican.” Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or an individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, “You can have this, or you can have every single thing you do in the world be Google-able forever,” I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are “no longer relevant,” “inadequate” information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers, they have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google “right to be forgotten horror stories.” That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European “right to be forgotten.” Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay. I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the “right to be forgotten” infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain:
    What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan:
    Well —

    Jonathan Zittrain:
    So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan:
    Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain:
    Thank you.

    John Donvan:
    Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker:
    Hi.

    John Donvan:
    If you can stand up and grab the mic.

    Male Speaker:
    Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan:
    Paul Nemitz, in Europe.

    Paul Nemitz:
    This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan:
    Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin:
    I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz:
    All search engines.

    Andrew McLaughlin:
    Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz:
    It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan:
    Sir. Can you tell us your name, please?

    Male Speaker:
    Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan:
    Good question.

    Male Speaker:
    And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan:
    Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain:
    It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin:
    I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan:
    Eric, did you want to respond to that?

    Eric Posner:
    It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states — allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —

    [laughter]

    Eric Posner:
    — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set this up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan:
    [unintelligible] take right down in the front, in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker:
    [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan:
    Just so that the radio audience knows who you’re talking about — the side arguing for the motion; they’re arguing for the "right to be forgotten."

    Female Speaker:
    Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the idea that, in the Fourth Amendment search and seizure context, you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan:
    Eric Posner.

    Eric Posner:
    Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan:
    I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker:
    For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan:
    Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain:
    I —

    John Donvan:
    Thank you for the question.

  • 01:06:56

    Jonathan Zittrain:
    — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan:
    Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan:
    When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan:
    Yeah, thanks.

    Male Speaker:
    This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan:
    Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan:
    — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz:
    Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law; but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.

    John Donvan:
    So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz:
    Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan:
    Sir in the back near the wall. Thanks.

    Male Speaker:
    Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

    John Donvan:
    Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker:
    In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?

    John Donvan:
    In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe — than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker:
    Not exactly. I’m saying another way of press suppression —

    John Donvan:
    Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker:
    Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain:
    I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan:
    Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner:
    I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan:
    From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin:
    Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan:
    Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?

    Paul Nemitz:
    Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker:
    But Paul —

    Paul Nemitz:
    This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker:
    Yeah, Paul, I have to — [unintelligible].

    John Donvan:
    [unintelligible].

    Andrew McLaughlin:
    I think you’re evading the central objection of our side to what you’re saying, and it is this: all of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan:
    But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

    Andrew McLaughlin:
    They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner:
    There are interests on both sides.

    John Donvan:
    Eric Posner.

    Eric Posner:
    So, that’s the classic role for the law, so that the interests on both sides —

    Andrew McLaughlin:
    How are they vindicated?

    Eric Posner:
    And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money from search; the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain:
    — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner:
    No, but they don’t want to set a precedent where they —

    Jonathan Zittrain:
    Google will pay no penalty for over-deleting.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    And no one will even know.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    Who will even know?

    Eric Posner:
    They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan:
    And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan:
    Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz:
    Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan:
    Thank you, Paul Nemitz.

    [applause]

    John Donvan:
    And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin:
    You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But censorship is justified only when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan:
    Thank you, Andrew McLaughlin.

    [applause]

    John Donvan:
    And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner:
    This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you: that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children; or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan:
    Thank you, Eric Posner.

    [applause]

    John Donvan:
    The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.

    Jonathan Zittrain:
    I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s, "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure there is transparency not on any specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t have solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan:
    Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain:
    — can I end with one thing —

    John Donvan:
    — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan:
    And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what was it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain:
    Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan:
    Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and the ticket sales don't come close to covering the cost of mounting a great debate like this. So, I'd encourage you, if you would be willing to make a donation, to go to our website, that's IQ2US.org, and make a donation there. Our next debate is later this month. It's at Columbia University at the Miller Theater. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We're going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we're going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We've heard the arguments — sorry? Did I — sorry? There's chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second; they went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We'll see you next time. [applause]

  • 01:31:40


  • 00:01:04

    Let's meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that's what you're going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You've most recently written the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven't, but I'm hoping they'll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let's welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You're no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I'd probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There's really no place in the world that's been so affected by it. But I do think it's a travesty. John Donvan: Okay, we're making clear you're not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you're a professor of law and computer science at Harvard. You're a cofounder of the Berkman Center for Internet & Society. You looked at the EU's — Europe's "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you're saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I've described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We're very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you'll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let's register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you're for the motion, number two if you're against, and number three if you're undecided. And we'll take about 15 seconds to complete that process, and then we'll lock out the voting devices. All right, folks, let's lock out those votes. And, again, we'll have the second vote right after you've heard the closing arguments. And then we'll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I'm not here on an official mission from the European Commission to turn America to this notion of law. It's more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don't tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the '70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can't ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that's what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don't have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment: first of all, you have to understand this just doesn't fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn't apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time, for things to be deleted, they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it's in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn't mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the courts. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what's happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, you can ask for credit information which is older than seven years to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I'm going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn't even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that's the outline of the argument that I'm going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it's a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let's be very clear, that is what we're talking about here. We're talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we're talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said, the way he framed it, that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it's way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court's decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It's very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul's claim that it's working kind of well, you can Google — well, you can Google while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you'll find the case of the piano player who didn't like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it's conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that's not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we've seen that the internet adapts. We've seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I'll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what's going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I'm John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You've heard two of the opening arguments, and now on to the third. Let's welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I'd like to ask you to cast your minds back 25, 30 years. Those of you who are old enough will remember; for some of the younger people here, I'll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy who is arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That's not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It's not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she's able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she's like and how she is a good mother. So, they know a little bit about what's going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other's infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they're private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves: not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let's say it could be an employer or a creditor or a future romantic partner — it's really in the stranger's interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn't convicted, it's still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people's privacy." But, of course, this is what's happened. It's happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the '80s and '90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay, that's completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people's privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10 or 15 years later — let's suppose all of this happened, you know, in the last few years — applies for a job. The employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother's friends, you know, in the '90s would have gossiped, whereas maybe today they would write about it on Facebook. Maybe she would've written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That's how she's going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It's public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents' names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they've lost control over this information about themselves, and it's not just that they've lost this control; they've lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We'd vote for that motion, too. I think I'm comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns?

    [laughter]

    Jonathan Zittrain:
    Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter]

    Jonathan Zittrain:
    It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera — that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft.

    [laughter]

    Jonathan Zittrain:
    But if we ended up with Wikisearch — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us, along with the bad. Thank you very much.

    John Donvan:
    Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz.

    Paul Nemitz:
    I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice.

    John Donvan:
    Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship.

    Paul Nemitz:
    It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives —

    John Donvan:
    Okay. Let me let the other side respond to that. Andrew McLaughlin.

    Andrew McLaughlin:
    Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship.

    John Donvan:
    When you say — just to be clear — when we say "deletion of information published by others," that is deletion of, literally, information published by Google, which is the search results. But the documents that are linked to stay online.

    Andrew McLaughlin:
    That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story.

    John Donvan:
    But the story is still there.

    Andrew McLaughlin:
    That’s correct.

    John Donvan:
    I’ve got to — so let me — let me bring in Eric Posner.

    Eric Posner:
    Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, and tries to publish them, you have a tort action against them. You can get damages as a —

    John Donvan:
    Just for folks who aren’t lawyers, can you explain the term "tort action"?

    Eric Posner:
    You can just sue them and get money. And —

    [laughter]

    Eric Posner:
    As a consequence —

    John Donvan:
    Now we’re talking.

    Eric Posner:
    As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well.

    Andrew McLaughlin:
    Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner:
    Not the subsequent. But the original —

    Andrew McLaughlin:
    But that’s not censorship. That’s a court action —

    Eric Posner:
    — the original —

    John Donvan:
    Let’s let Paul — Eric, continue.

    Andrew McLaughlin:
    Sorry.

    John Donvan:
    Eric, were you done [unintelligible]?

    Eric Posner:
    Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy.

    John Donvan:
    We’ll take Jonathan Zittrain.

    Jonathan Zittrain:
    I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving is that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down — I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: go on down, enjoy the stacks, to use an example from the ’90s that you might relate to.

    John Donvan:
    Would — would one of you —

    Jonathan Zittrain:
    That just seems Borgesian.

    John Donvan:
    I want to allow one of you to address that metaphor. The audience connected with that.

    [laughter]

    John Donvan:
    So, I want to see what your response to it is.

    Paul Nemitz:
    I can —

    John Donvan:
    Do you want to take it, Paul? All right. Eric Posner.

    Eric Posner:
    Here is another metaphor, so —

    [laughter]

    Eric Posner:
    Well, for example —

    John Donvan:
    It has to be a direct response metaphor, or you have to directly respond to their —

    Eric Posner:
    Okay. So the — so —

    John Donvan:
    And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually —

    Male Speaker:
    You know, the [unintelligible]. You just have to rearrange the letters.

    Eric Posner:
    I agree — I agree with Jonathan.

    John Donvan:
    Eric Posner.

    Eric Posner:
    I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy.

    John Donvan:
    Andrew McLaughlin.

    Andrew McLaughlin:
    I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here" — like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left —

    John Donvan:
    Paul Nemitz.

    Andrew McLaughlin:
    — limited to search engines —

    Paul Nemitz:
    Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know — an interest in someone who stepped out, him or herself, into the public — the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook.

    [laughter]

    Paul Nemitz:
    Okay? Because you have political thinking, and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect.

    John Donvan:
    Let me —

    Paul Nemitz:
    Because when that’s happening, tomorrow you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking —

    John Donvan:
    Okay, Paul —

    Paul Nemitz:
    — or giving you the right to deletion.

    John Donvan:
    — you’ve phrased the question —

    [applause]

    John Donvan:
    — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect?

    Jonathan Zittrain:
    I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes. So that’s factually not true.

  • 00:46:05

    Facebook —

    Paul Nemitz:
    Not today, Jonathan.

    Jonathan Zittrain:
    — well, yeah —

    Paul Nemitz:
    But tomorrow, my friend.

    Jonathan Zittrain:
    — I — I’m saying —

    [laughter]

    Paul Nemitz:
    Right?

    Jonathan Zittrain:
    — yes, yes —

    [applause]

    Paul Nemitz:
    Here we are in America. The innovation of technology doesn’t stop.

    Jonathan Zittrain:
    — yes, yes, and —

    John Donvan:
    Jonathan, I think, though, that the —

    Jonathan Zittrain:
    — yeah.

    John Donvan:
    — the spirit of his question should not be lost in the technicalities.

    Jonathan Zittrain:
    Yes.

    John Donvan:
    I think he’s asking a serious question here.

    Jonathan Zittrain:
    Just the fact that he’s wrong should not, I agree —

    [laughter]

    Jonathan Zittrain:
    — block the — yeah, that’s true.

    [talking simultaneously]

    John Donvan:
    He might be wrong about what happens with the like button.

    Jonathan Zittrain:
    Yeah.

    John Donvan:
    He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power, or for the individual to have that power?

    Jonathan Zittrain:
    I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make — if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you want to do, Facebook — you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable: that’s what I object to.

    John Donvan:
    Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner.

    Eric Posner:
    These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people — and not ordinary people who commit crimes, or serious crimes — and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it.

    Jonathan Zittrain:
    It happens over and over again in the law — that’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place?

    John Donvan:
    Let’s — can you defer to Paul? Paul Nemitz.

    Paul Nemitz:
    I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make, on a daily basis, a decision: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made him or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this, under the law, should not be taken down.

    John Donvan:
    Andrew —

    Paul Nemitz:
    On the other hand —

    John Donvan:
    Well, let me —

    Paul Nemitz:
    It is not different from any other newspaper previously. Newspapers have to make these decisions —

    John Donvan:
    Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that?

    Andrew McLaughlin:
    Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past.

    John Donvan:
    Why do you — why do you think that?

    Andrew McLaughlin:
    The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you.

    John Donvan:
    So, we’re getting all scientific about it.

    Andrew McLaughlin:
    Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten."

    Jonathan Zittrain:
    Well, there goes this debate. It’s now going to be deleted again. Thanks a lot.

    [laughter]

    John Donvan:
    Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up.

    Andrew McLaughlin:
    But I — let me just —

    John Donvan:
    It’s not over yet.

    Andrew McLaughlin:
    Let me make just one other point, though. Which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan:
    How about that, Eric Posner?

    Eric Posner:
    Well, the poor people, the non-elites that you’re talking about — they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories.

    Andrew McLaughlin:
    But you could just have a form on Google to type it in —

    Eric Posner:
    They’re the most vulnerable people. The —

    John Donvan:
    But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible].

    Eric Posner:
    Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do.

    John Donvan:
    Okay. I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible].

    Jonathan Zittrain:
    We want to give you: buy one, get one free.

    John Donvan:
    All right. There you go.

    [laughter]

    John Donvan:
    No, I mean, I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost.

    Female Speaker:
    Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm, or what have you.

    John Donvan:
    But you recognize that that forgetting will only happen when you put in the driver’s name.

    Female Speaker:
    Right. But still, I mean, that’s what I’m asking.

    John Donvan:
    Okay.

    Female Speaker:
    Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay — for, you know, I want it to be remembered because —

    John Donvan:
    But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for —

    Female Speaker:
    Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How —

    John Donvan:
    But think — is it —

    Female Speaker:
    Did they infringe upon each other’s [unintelligible]?

    John Donvan:
    Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story — that won’t be linked under the other guy’s name — push back and want that link to show up under the other guy’s name?

    Paul Nemitz:
    Haven’t heard about it.

    John Donvan:
    Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you.

    Male Speaker:
    Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate?

    Jonathan Zittrain:
    You got me.

    [laughter]

    Jonathan Zittrain:
    How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know —

    John Donvan:
    Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of —

    Jonathan Zittrain:
    But I think —

    John Donvan:
    — interest in this regard. Do you think this is a relevant question?

    Jonathan Zittrain:
    Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it —

    John Donvan:
    It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the idea that, you know, in the Fourth Amendment search and seizure context, a reasonable expectation of privacy — you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as delete — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and, if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me, in Europe, applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller. You would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency, like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, where, you know, there’s blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role of the law, to weigh the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "right to be forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he has done or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank (you know, they can ask in school after a certain time for things to be deleted), they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty, the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here; we’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends — you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days the divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement why that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera — that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered — and maybe they can, back to you guys — I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, “no longer relevant” in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy — but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, didn’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down — I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog — like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough — that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is raise the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person — become a politician — journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here" — like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post — namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes — so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power, or for the individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice — any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes — and there’s frequently a time component, you know, five, 10 years ago — and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law — that’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision at the insistence of the government, and it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: Well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish premier league referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about — they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results: here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Do they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in a story that won’t be linked under the other guy’s name push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration, there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I'm Adam. This — this law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.
    Paul Nemitz: It's not Google-specific; the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms to temper things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: Congress passed a statute and said something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional for one thing, but I also think it’s just fundamentally a terrible, terrib le idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased that, you kno w, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you're talking about, the side arguing for the motion — they're arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, where you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. That's in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on.
    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it's exactly the kind of surveillance on who's wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.
    John Donvan: Okay. So you're asking, if you were smart enough, which it sounds like you probably are — [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law; but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me, in Europe, applies to the state and also to corporations. But of course, within reasonable national security limitations and so on, you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you'd like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I'm saying another way of press suppression —
    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]
    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet ha s leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete is not the right answer. We need to have the internet evolve. I’ll like to see Google evolve, to add a more direc t right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we're going to delete or not, what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area — this is an area which is subject to the law. It's not a discretionary decision of Google where, you know, there's the blue sky and beyond that nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.
    John Donvan: But it does — it does live on. It's not removed from the internet. It's just you can't find it through the criminal's name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that's the classic role for the law, so that the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. So Google makes money by searching well; the better its searches are, the more money it makes. This is why — [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don't want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion: the US should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote: the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz. [applause]
    John Donvan: And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation. That is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of th e social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the de bate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of differentkinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This ca se, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image th at I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin. [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it's you: that perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children; or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner. [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can't believe, in the last slot, I'm still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening. [laughter] This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening is shifting. It's a standard that's, "Well, we can tweak that. We'll make sure that only the good stuff stays and only the bad stuff goes." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure transparency comes not on specific cases, only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn't know solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won't go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I'm sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this w as a little bit of a tough debate in terms of it was relatively nuanced and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about. You just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so. [applause]
    John Donvan: Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a co-founder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already in the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact, in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. In the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part. And Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty, the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches on his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, and she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her, and what she’s like, and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera, that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant," in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct-response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over. I’d really like you guys to give it its fullest consideration, there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific just because the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US, and the Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power would be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, which says, you know, that a reasonable expectation of privacy — you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if in the case of the Telegraph they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus where anybody could create a profile that shows up very high on your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law as anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU commission, directorate general for justice and consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now, this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about, you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was: The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to be Forgotten' Online." In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. For the team arguing against the motion, their first vote was 26 percent and their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve most recently written the book "The Twilight of Human Rights Law." And back in the U.S., when Europe adopted its "Right to Be Forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have that privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank — you know, they can ask the school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact, in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty, the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut (W-U-U-T), which are applications that are designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too; I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, "Bing, how many are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that whether fascistic dictators or communist dictators, they don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state with some trepidation to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, which at the insistence of the government it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion and, for example, you know, newspapers, they have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, I need to — just in the interest of time I need to interrupt you for that, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way in your view to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. [laughter] Eric Posner: You know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "For" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should the first thing they find be this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, it’s blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. Google makes money by searching: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. Speech on the internet is a conversation: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push empowers the individual on only one side of that conversation — a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified only when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So when you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure there’s transparency not on specific cases, only on the overall criteria; that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and in the first vote 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment: first of all, you have to understand this just doesn’t fall from the sky like this, but it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, more than any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time, for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul framed it as a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, whereas maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no. And Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say: please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information "no longer relevant in the view of the complainant," in the words of the decision by the European Court of Justice, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, for example, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say: geez, you guys are gathering so much stuff; before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, you are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this. Please, Bing, do it too. Please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in a story that won’t be linked under the other guy’s name push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press, to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific — it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: the Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter]

    Eric Posner: — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented.

    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten."

    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, where, as to a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU, it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that and, if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are — [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me?" in Europe apply to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying another way of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no — we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz. [applause]

    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin. [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. That perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner. [applause]

    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening was shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan. [laughter] [applause]

    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause]

    John Donvan: Well, I’m sorry — the clock is my master and has to be your master, and the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you if you would be willing to make a donation to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online and the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote the result was for the team arguing for the motion their second vote was 35 percent, that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why this is important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed. She has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on but they still have a relatively complete image of who she is. A third example, a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information. It could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, "here’s what was deleted today," it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say deletion of information published by others, that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible, they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here." Like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is, what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it (and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too), the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, which at the insistence of the government it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers, they have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down. All this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here, where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay. I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, saves the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curveball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a "nothing to see here," incremental extension of past principles. But it is a radical departure from prior precedents, in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states that allow people’s criminal records to be erased — mean that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.
    [laughter]
    Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria we could develop. Now, it’s often very difficult to set these up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in front, in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion. They’re arguing for the "right to be forgotten."
    Female Speaker: Right. So, to me, it’s based on an inherent right to privacy. But how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment operates in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that will probably respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question — I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past, there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity; you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides here appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe apply to the state and also to corporations — but of course, within reason: national security limitations and so on. You know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should the first thing they find be this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled is that people develop over time. You know, the thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about it. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be a kind of nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody can create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power — the adjudicating power — to Google, a private corporation, to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of that authority, just as one can against a decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth estate. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: all of the pressures that you just outlined are toward deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. Nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with it.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law — to balance the interests on both sides.
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. What’s on the side of disclosure is something far more powerful than the government: the profit motive. Google makes money from search — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round: Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination. And it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem — particularly where there are other paths (more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes) to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children — we censor a lot of different kinds of information. But such censorship is justified only because it overcomes a very high bar to stand as an exception to the right of free speech. The affirmative case has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. You can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children, one that you wouldn’t want them to hear, is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is the first thing that people learn about you when they Google your name. So when you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you: who is the "he" in that statement? Is the "he" you? Is the "he" me? Or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure disclosure happens not on specific cases, only on the overall criteria, and that seems very, very dangerous to me. Now, John said we didn’t give solutions, and in 30 seconds it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include, possibly, even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what was it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve most recently written the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe established its "right to be forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s — "right to be forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment: first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, the school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house to cover the social contributions more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, in the beginning, very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, you can ask for information about your credit which is older than seven years to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul framed it as a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles, the ones that I found being in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from 2010, 2011. These are not ancient history. One other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 or 30 years. Those of you who are old enough will remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy who is arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, misses some credit card bills, and files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they were private. Not everybody in the world, or even outside this area, hears about these events. And the law protected them. There were privacy laws; they still exist. There were even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened in the last few years — applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, who in the ’90s would have gossiped, today would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days the divorce would be public. It’s public information. But it was put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. The neighbors, the people in the neighborhood, know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the "for" side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity or individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the U.S. is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked on implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and — in fact, I don’t know — calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents, in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific — it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes, in most states, that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.
    [laughter]
    Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria we can set. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and, if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking: if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook, and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, that censorship — which I don’t mean as a kind of nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly — and then a judge will decide — or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, it’s blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law — to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money the better its searches are — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met the burden of showing that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children, that you wouldn’t want them to hear, is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you. This is the first thing people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard of, "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure there is transparency not on specific cases, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center, to continue the ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40


  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe adopted its "Right to Be Forgotten" a little while back, most American academics were quite skeptical and even outraged, but you said that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. We’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. Then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the words "Right to Be Forgotten," the natural words, don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already in the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment: first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that people can ask data brokers, the bank, or the school for things to be deleted after a certain time, they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and there was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of control over one’s own information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which go to the courts. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, information about your credit which is older than seven years you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," are powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of permanent record that follows them everywhere. And the fear it generates is that you will forever be defined by that one mistake you made that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty: the power to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so, finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, and she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they were private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name into Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? As for the single mother: her friends, you know, in the ’90s would have gossiped, whereas today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, the divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is: are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were: have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were: should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state with some trepidation to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, for example, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over. I’d really like you guys to give it its fullest consideration, there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google specific that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper in your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that makes the case more compelling? So, for example, you know, we imagine if this happened in the US, the Congress passed a statute and say something like, something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power will be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it’s based on an inherent right to privacy, but how do you square that with at least the U.S., going to your constitutional point, that says, you know, with the Fourth Amendment search and seizure context, that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook, how do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said but I suspect will be revised over time. These constitutional norms are not fixed, they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if in the case of the Telegraph they had a list of suppressed articles but to access those suppressed articles you had to log in, in some fashion or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as delete — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature —" I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation. I know the law in Europe doesn’t address it, but let’s say does it become relevant in this conversation. Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on, you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer for his private purposes design his own web engine, so this constellation would mean that you are not a controller, you would not be treated like Google and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side, Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal, but as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people, so perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal to imposing something like this? John Donvan: In a sense are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete, is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus where anybody could create a profile that shows up very high on your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law as anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money by searching well; the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the EU commission, directorate general for justice and consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hand, also in the times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again but this time that it’s you, that perhaps you’re going through hard times in your marriage if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now, this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria, that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records like legal documents and court materials, that options for contextualization to include possibly even somebody like Google deciding, and when it’s a search on a name maybe the first result shouldn’t be those 10 terrible whatever comes out of the roulette wheel links but a curated page marked as such and then many won’t go to the second page, reputation systems and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this was a little bit of a tough debate in terms of it was relatively nuanced and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about. You just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online. In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice, or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that people can ask data brokers, the bank, the school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times and, if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say: please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascistic dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks — to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here." Like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, for example, from the Washington Post — namely, that a concert pianist was asking for a takedown of the critique, the bad critique, of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes. So that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, jeez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes — and there’s frequently a time component, you know, five, 10 years ago — and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about — they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, saves the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story — that won’t be linked under the other guy’s name — push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, jeez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I'm Adam. This — this law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.
    Paul Nemitz: It's not Google-specific; it's that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute that said something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased, so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side. For the — [laughter]
    Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you're talking about, the side arguing for the motion — they're arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, where, as to a reasonable expectation of privacy, you don't have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I'm going to — unless you really need to respond to that question — I'm going to move on.
    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as delete —
    John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it's exactly the kind of surveillance of who's wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.
    John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are — [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me, in Europe, applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you'd like to address that.
    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I'm saying another way of press suppression —
    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]
    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50 now, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody's life. And so, you know, my view on this is, still, censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And, by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power — the adjudicating power — to Google, a private corporation, to make the decision, yes or no, we're going to delete or not, what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area — this is an area which is subject to the law. It's not a discretionary decision of Google where, you know, there's the blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.
    John Donvan: But it does — it does live on. It's not removed from the internet. It's just you can't find it through the criminal's name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that's the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. Google makes money by searching well; the better its searches are, the more money it makes. This is why — [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don't want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz. [applause]
    John Donvan: And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it's justified when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin. [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine, again, but this time that it's you: that perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner. [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society.
    Jonathan Zittrain: I can't believe that, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the folks were saying over the course of the evening is shifting. It's a standard that's, "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn't offer solutions, and in 30 seconds it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page, marked as such — and then many won't go to the second page; reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I'm sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so. [applause]
    John Donvan: Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and the first vote: 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion: their second vote was 35 percent. That means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]
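
The winner-by-swing rule Donvan applies here — compare each side's share of the audience vote before and after the debate, and award the win to the side with the larger gain — can be sketched in a few lines. This is purely an illustration using the percentages read out above, not any actual Intelligence Squared tallying software.

```python
# Illustration of the Intelligence Squared scoring rule: the side whose
# share of the audience vote rises the most between the pre-debate and
# post-debate polls is declared the winner.
pre  = {"for": 36, "against": 26, "undecided": 38}
post = {"for": 35, "against": 56, "undecided": 9}   # 9 = 100 - 35 - 56

# Swing is computed only for the two sides; undecided voters simply
# redistribute between them over the course of the debate.
swing = {side: post[side] - pre[side] for side in ("for", "against")}
winner = max(swing, key=swing.get)

print(swing)   # {'for': -1, 'against': 30}
print(winner)  # against
```

The against side's 30-point gain (56 − 26) is what carries the debate, despite the for side losing only a single point.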

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know — they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say — and children when they become 18 — to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the Federal Trade Commissioner, Julie Brill, just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times and, if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no. And Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when a request is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say: please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, “no longer relevant” in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is “The US Should Adopt the ‘Right to Be Forgotten’ Online.” Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, “The US Should Adopt the ‘Right to Be Forgotten’ Online.” We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the “right to be forgotten,” is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is “censorship”: that the kind of right that we’re talking about, where an individual can go to Google and say, “When my name comes up, I don’t want this link to show up or this link to show up or this link to show up,” amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU “right to be forgotten” exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, when we say “deletion of information published by others,” that is deletion of — literally of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, and tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term “tort action”? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word “censorship.” I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving is that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a “right to be forgotten” implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index, rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word “censorship” is not helpful here. What the “Right to Be Forgotten” does is raise the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists and biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, “Look, there’s just nothing much to see here,” like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the “Right to Be Forgotten” doesn’t apply. The example which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I’ll give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, “I’m a Democrat,” “I’m a Republican.” Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is: what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or an individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, “You can have this, or you can have every single thing you do in the world be Google-able forever,” I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web, and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are “no longer relevant,” “inadequate” information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is that through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, which at the insistence of the government it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make, on a daily basis, a decision: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made the decision himself or herself to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google “right to be forgotten horror stories.” That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European “right to be forgotten.” Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the “right to be forgotten” infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, saves the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in a story that won’t be linked under the other guy’s name push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and — in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

John Donvan: Well —

Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.

John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."

Jonathan Zittrain: Thank you.

John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

Male Speaker: Hi.

John Donvan: If you can stand up and grab the mic.

Male Speaker: Hi. I'm Adam. This — this law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not?

John Donvan: Paul Nemitz, in Europe.

Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that?

Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

Paul Nemitz: All search engines.

Andrew McLaughlin: Paul frames this as just, like, a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.

Paul Nemitz: It's not Google-specific; it's just that the case was about Google.

  • 01:00:57

But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

John Donvan: Sir. Can you tell us your name, please?

Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: the Congress passed a statute that said something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

John Donvan: Good question.

Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —

John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.

Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional for one thing, but I also think it’s just fundamentally a terrible, terrib le idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased that, you kno w, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.

John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —

Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

John Donvan: Just so that the radio audience knows who you're talking about, the side arguing for the motion — they're arguing for the "right to be forgotten."

Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner.

Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.

John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on.

Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.

Jonathan Zittrain: I —

John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it's exactly the kind of surveillance of who's wanting to peek under the envelope that all sides appear to be concerned about.

John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

[laughter]

John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

[talking simultaneously]

John Donvan: Yeah, thanks.

Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which — you know, in the EU, it sounds like it'd give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.

John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are —

[laughter]

John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me, in Europe, applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

John Donvan: So, how far are you getting in asking the NSA to delete information?

[laughter]

Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks.

Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?

John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —

Male Speaker: Not exactly. I'm saying it's another way of press suppression —

John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

[applause]

Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.

John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, now, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They're only looking at what comes up on the first page. Andrew McLaughlin.

Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody's life. And so, you know, my view on this is, still, censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?

Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.

Male Speaker: But Paul —

Paul Nemitz: This is an area — this is an area which is subject to the law. It's not a discretionary decision of Google — that, you know, the blue sky, and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you're doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

Male Speaker: Yeah, Paul, I have to — [unintelligible].

John Donvan: [unintelligible].

Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.

John Donvan: But it does — it does — it does live on. It does live on. It's not removed from the internet. It's just you can't find it through the criminal's name.

Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee's — name to be associated with that.

  • 01:17:01

They have an interest in that.

Eric Posner: There are interests on both sides.

John Donvan: Eric Posner.

Eric Posner: So, that's the classic role for the law, so that the interests on both sides —

Andrew McLaughlin: How are they vindicated?

Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —

[talking simultaneously]

Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

Eric Posner: No, but they don't want to set a precedent where they —

Jonathan Zittrain: Google will pay no penalty for over-deleting.

Eric Posner: They will.

Jonathan Zittrain: And no one will even know.

Eric Posner: They will.

Jonathan Zittrain: Who will even know?

Eric Posner: They will — they will know, because it will do worse than an alternative search engine in the world in which we have search engines competing with each other.

John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause]

John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate General for Justice and Consumers.

Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion: the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote: the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.

John Donvan: Thank you, Paul Nemitz.

[applause]

John Donvan: And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it's justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

John Donvan: Thank you, Andrew McLaughlin.

[applause]

John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.

Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you. That perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

John Donvan: Thank you, Eric Posner.

[applause]

John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.

Jonathan Zittrain: I can't believe that, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

[laughter]

Jonathan Zittrain: This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web — let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it's shifting. It's a standard that's, "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case — only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn't offer solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won't go to the second page; at reputation systems; and, of course, competition —

John Donvan: Jonathan Zittrain, I'm sorry —

Jonathan Zittrain: — can I end with one thing —

John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

[laughter]

[applause]

John Donvan: And that concludes our closing statements.

  • 01:27:04

And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]

John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what was it you were going to say before I cut you off? Now that the voting is done.

[talking simultaneously]

Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.

[applause]

John Donvan: Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion had a first vote of 26 percent and a second of 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he has done or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear — when we say "deletion of information published by others," that is deletion of, literally, information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported, somewhat amusingly, on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Go on down, enjoy the stacks — to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here" — like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave from the Washington Post, namely that a concert pianist was asking for a takedown of the critique, the bad critique, of his piano performance here in America: yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking, and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, gees, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law — that’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make a decision on a daily basis: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made the decision, himself or herself, to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan:

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, which won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curveball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific — it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute that said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about — how strict the criteria would be — and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think there are administrative criteria we can set. Now, it’s often very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "right to be forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question — I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found, and would find, that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and, if it would apply to that, how it would apply in the United States to other private search engines — such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe apply to the state and also to corporations — though of course within reasonable national security limitations and so on; you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own search engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, saying, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal against imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying it’s another avenue of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should the first thing they find be this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled is that people develop over time. You know, the thing that you do as a 16- or 17-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about it. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is still that censorship — which I don’t mean as a kind of nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google — that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth estate. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. Nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law — to weigh the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated. What’s on the side of disclosure is something far more powerful than the government: the profit motive. Google makes money by searching well — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But in each case the censorship is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come of this. Today, imagine it now: this is what people learn about you — this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard of — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure transparency happens not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t owe solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and in the first vote 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, their second was 56 percent. They went up 30 percent. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]
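    As a quick sanity check on the arithmetic above, the scoring rule Donvan describes (the side whose vote share moves the most between the two polls wins) can be sketched as follows. The percentages are the ones read out in the debate; the second undecided share is inferred from the remainder, since the transcript does not state it.

```python
# Scoring rule used in the debate: whichever side's vote share moves the
# most between the pre- and post-debate polls is declared the winner.
# Figures are the announced percentages; the post-debate undecided share
# (9) is inferred from the remainder, not stated in the transcript.
pre = {"for": 36, "against": 26, "undecided": 38}
post = {"for": 35, "against": 56, "undecided": 9}

def declare_winner(pre, post):
    """Return (winning side, per-side deltas) under the biggest-swing rule."""
    deltas = {side: post[side] - pre[side] for side in ("for", "against")}
    return max(deltas, key=deltas.get), deltas

side, deltas = declare_winner(pre, post)
print(side, deltas)  # against {'for': -1, 'against': 30}
```

The "for" side moved down one point while "against" gained thirty, which is why the transcript's "They went up 36 percent" reads as a transcription slip for 30.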


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because it — you’re saying — something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by a vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore — and now I turn to the recent judgment — first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time — for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact, in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through the opening round of this Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest in knowing what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or the individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty. Paul is doing good work in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, saves the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in a story that won’t be linked under the other guy’s name push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

John Donvan: Well —

Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

Jonathan Zittrain: Thank you.

John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

Male Speaker: Hi.

John Donvan: If you can stand up and grab the mic.

Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

John Donvan: Paul Nemitz, in Europe.

Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that?

Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

Paul Nemitz: All search engines.

Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy, and profiles you much more, than only the one website on which you have your stuff.

John Donvan: Sir. Can you tell us your name, please?

Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

John Donvan: Good question.

Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?

Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

John Donvan: Eric, did you want to respond to that?

Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together — meaning that he’s on our side. [laughter]

Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set these up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion; they’re arguing for the "right to be forgotten."

Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that — going to your constitutional point — with the position, at least in the U.S., in the Fourth Amendment search-and-seizure context, that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner.

Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past, there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —

John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

Jonathan Zittrain: I —

John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found, and would find, that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.

John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]

John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] Yeah, thanks.

Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter]

John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion, and the right to ask what you have about me, applies in Europe to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.

John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]

Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks.

Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.

John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —

Male Speaker: Not exactly. I’m saying another way of press suppression —

John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]

Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook, and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin.

Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still: censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?

Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC.

Male Speaker: But Paul —

Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google — that, you know, it’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

Male Speaker: Yeah, Paul, I have to — [unintelligible].

John Donvan: [unintelligible].

Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: all of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

John Donvan: But it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.

Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that.

Eric Posner: There are interests on both sides.

John Donvan: Eric Posner.

Eric Posner: So, that’s the classic role for the law — balancing the interests on both sides —

Andrew McLaughlin: How are they vindicated?

Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. The better Google’s searches are, the more money it makes. This is why — [talking simultaneously]

Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

Eric Posner: No, but they don’t want to set a precedent where they —

Jonathan Zittrain: Google will pay no penalty for over-deleting.

Eric Posner: They will.

Jonathan Zittrain: And no one will even know.

Eric Posner: They will.

Jonathan Zittrain: Who will even know?

Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause]

John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination. And it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.

John Donvan: Thank you, Paul Nemitz. [applause]

John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

John Donvan: Thank you, Andrew McLaughlin. [applause]

John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law — an area of the law — and to some extent statutory law and other types of law. And I just want to ask you to imagine, again, but this time that it’s you: that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

John Donvan: Thank you, Eric Posner. [applause]

John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web — let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening is shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition —

John Donvan: Jonathan Zittrain, I’m sorry —

Jonathan Zittrain: — can I end with one thing —

John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]

John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously]

Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause]

John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and the first vote: 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion: their second vote was 35 percent, which means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the Federal Trade Commissioner, Julie Brill, just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    The suppression of true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is: are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were: have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were: should everything be recorded at all times and, if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty. Paul is doing good work in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google specific — the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper, maybe, things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. [laughter] Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion; they’re arguing for the "right to be forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point, in the Fourth Amendment search-and-seizure context — the idea that a reasonable expectation of privacy is something you don’t have the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and, if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask, "What do you have about me?" applies in Europe to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. This constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is, perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no — we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — something like an independent agency like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right of informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. It is the empowerment of an individual on one side of a conversation — which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So when you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s, "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. And we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry — the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center, to continue the ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates. And I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. Your most recent book is "The Twilight of International Human Rights Law." And back in the U.S., when Europe adopted its "right to be forgotten" a little while back, most American academics were quite skeptical and even outraged, but you said that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. We’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. Then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the words "right to be forgotten," the natural words, don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already in the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete your medical records, because the doctor must keep them, to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, than any other source of information about this person. And our law requires that in the same way that our people can ask data brokers, the bank, the school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question, who brought this case from Spain, had not paid some social contributions, and therefore his house had been confiscated. And it was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that they are now dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which go to the courts. Google has understood the European law of balance between privacy and free speech. The decision was, in the beginning, very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, you can ask for information about your credit which is older than seven years to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake you made that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty: the ability to force other people to forget what they would otherwise remember. And Paul owned up to this very directly in his opening statement. That ability, that right to remember, is one of the most fundamental rights of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said, the way he framed it, that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches on his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they were private. Not everybody in the world, or even outside this area, heard about these events. And the law protected them. There were privacy laws — they still exist. There were even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened in the last few years — applies for a job. The employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, who in the ’90s would have gossiped, today would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. The neighbors, the people in the neighborhood, know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us, along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, when we say "deletion of information published by others," that is deletion, literally, of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index, rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests, more than 70 percent, have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique, the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking, and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, in the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. Which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain:
    What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan:
    Well —

    Jonathan Zittrain:
    So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan:
    Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain:
    Thank you.

    John Donvan:
    Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker:
    Hi.

    John Donvan:
    If you can stand up and grab the mic.

    Male Speaker:
    Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not?

    John Donvan:
    Paul Nemitz, in Europe.

    Paul Nemitz:
    This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan:
    Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin:
    I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz:
    All search engines.

    Andrew McLaughlin:
    Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz:
    It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan:
    Sir. Can you tell us your name, please?

    Male Speaker:
    Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power would be delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan:
    Good question.

    Male Speaker:
    And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan:
    Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain:
    It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin:
    I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan:
    Eric, did you want to respond to that?

    Eric Posner:
    It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —

    [laughter]

    Eric Posner:
    — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan:
    [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker:
    [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan:
    Just so that the radio audience knows who you’re talking about — the side arguing for the motion — they’re arguing for the "right to be forgotten."

    Female Speaker:
    Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says, you know, that a reasonable expectation of privacy — you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

    John Donvan:
    Eric Posner.

    Eric Posner:
    Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan:
    I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker:
    For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan:
    Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain:
    I —

    John Donvan:
    Thank you for the question.

  • 01:06:56

    Jonathan Zittrain:
    — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan:
    Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan:
    When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan:
    Yeah, thanks.

    Male Speaker:
    This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU, it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan:
    Okay. So you’re asking if you were smart enough, which sounds like you probably are —

    [laughter]

    John Donvan:
    — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz:
    Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan:
    So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz:
    Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan:
    Sir in the back near the wall. Thanks.

    Male Speaker:
    Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that.

    John Donvan:
    Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker:
    In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal against imposing something like this?

    John Donvan:
    In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker:
    Not exactly. I’m saying another way of press suppression —

    John Donvan:
    Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker:
    Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain:
    I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan:
    Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner:
    I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan:
    From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin:
    Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan:
    Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz:
    Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court.

    Male Speaker:
    But Paul —

    Paul Nemitz:
    This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker:
    Yeah, Paul, I have to — [unintelligible].

    John Donvan:
    [unintelligible].

    Andrew McLaughlin:
    I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan:
    But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

    Andrew McLaughlin:
    They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner:
    There are interests on both sides.

    John Donvan:
    Eric Posner.

    Eric Posner:
    So, that’s the classic role for the law, so that the interests on both sides —

    Andrew McLaughlin:
    How are they vindicated?

    Eric Posner:
    And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain:
    — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner:
    No, but they don’t want to set a precedent where they —

    Jonathan Zittrain:
    Google will pay no penalty for over-deleting.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    And no one will even know.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    Who will even know?

    Eric Posner:
    They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan:
    And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan:
    Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate General for Justice and Consumers.

    Paul Nemitz:
    Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan:
    Thank you, Paul Nemitz.

    [applause]

    John Donvan:
    And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin:
    You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation — which is what speech on the internet is; it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan:
    Thank you, Andrew McLaughlin.

    [applause]

    John Donvan:
    And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner:
    This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan:
    Thank you, Eric Posner.

    [applause]

    John Donvan:
    The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain:
    I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition —

    John Donvan:
    Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain:
    — can I end with one thing —

    John Donvan:
    — your time is up. No [spelled phonetically]. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan:
    And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain:
    Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan:
    Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten": the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, were in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal like that, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, whereas maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, "here’s what was deleted today," it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anybody to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear — when we say "deletion of information published by others," that is deletion of, literally, information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, for example, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes. So that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make a decision on a daily basis: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: Well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results: here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that, because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute that said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased, so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.
    [laughter]
    Eric Posner: — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set these up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front, in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion. They’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit. And although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete, is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, so that the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money by searching: the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, directorate general for justice and consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote: the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation. Speech on the internet is people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you: that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening is shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it’s not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and the first vote: 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion: their second vote was 35 percent, so they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why this is important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, “no longer relevant” in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for instance, which Andrew gave from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America. Yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or an individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff. Before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, keeps the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan:

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked on implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And, Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just a very "nothing to see here," kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific just because the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: the Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter]

    Eric Posner: — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria. Now, often it’s very difficult to set these up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion. They’re arguing for the "right to be forgotten."

    Female Speaker: Right. So, to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found, and would find, that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe apply to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]

    Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying another way of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook, and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: all of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: the profit motive. The better Google’s searches are, the more money it makes. This is why — [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech — the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz. [applause]

    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified only when it overcomes a very high bar to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin. [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you. This is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner. [applause]

    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff comes down." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that disclosure happens not on specific cases, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, including possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause]

    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business Law and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe handed down its "Right to Be Forgotten" ruling a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the 'Right to Be Forgotten' Online" — and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment: first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more about this person than any other source of information, and our law requires that, in the same way that our people can ask data brokers, the bank, or the school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go on to the courts. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty: the power to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. And this is about suppressing true information — let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information: not libel, not defamation, not hate speech.

  • 00:15:59

    This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — this is the way he framed it — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google — well, while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note: as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay, that’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened in the last few years — he applies for a job, the employer puts his name into Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s would have gossiped, where today they would write about it on Facebook. Or maybe she would have written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is: are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were: have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were: should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state with some trepidation to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision at the insistence of the government, and it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examp les, the guy who did the professional [unintelligible] the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, I need to just in the interest of time I need to interrupt you for that, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was justmaking. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy and Paul is actually saying it’s the ordinary guy who’s getting the take downs and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan:

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I'm Adam. This law seems to be very specifically tailored to Google. And I'm wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get it deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.
    Paul Nemitz: It's not Google-specific; it's just that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: are there ways — mechanisms — to temper the things we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations, and the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to deny a request, or didn't want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically: is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious that you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And — I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states — allow people's criminal records to be erased, so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something we can debate about — how strict the criteria would be — and I suspect Jonathan and I are very close together, meaning that he's on our side.
    [laughter]
    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So I do think there are administrative criteria we can set. Now, often it's very difficult to set them up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. The mic's coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion; they're arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search-and-seizure context, where you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on.
    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you're going with that. Let's let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it's exactly the kind of surveillance of who's wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which — you know, in the EU, it sounds like it'd give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.
    John Donvan: Okay. So you're asking: if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law. But of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So the right to deletion, and the right to ask what you have about me, applies in Europe to the state and also to corporations — but of course within reason, national security limitations and so on. You know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is: he described a case where he would, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is, you know, an alarming signal against imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I'm saying another way of press suppression —
    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply, and I think a lot about it. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody's life. And so my view on this is, still, censorship — which I don't mean as a kind of nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power — the adjudicating power — to Google, a private corporation, to make the decision, yes or no, we're going to delete or not, what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google where, you know, there's blue sky and beyond that nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you're doing, tell us what the criteria are. That is good, and we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: all of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — crimes where there were victims. Some of those victims might want all evidence of that crime to disappear. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.
    John Donvan: But it does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee: the fans of the team that was wronged by the false penalty call may want that referee's name to be associated with it.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So that's the classic role for the law: to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. What's on the side of disclosure is something far more powerful than the government: it's the profit motive. Google makes money — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don't want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes possible total prediction, total collection of anything you do, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That's what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case — the affirmative case — has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine, again, but this time that it's you — that perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So imagine it's you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you — the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can't believe that, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    Jonathan Zittrain: This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff comes down." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like: when Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know? Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn't have solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding that when it's a search on a name, maybe the first result shouldn't be those 10 terrible links, whatever comes out of the roulette wheel, but a curated page marked as such, and then many won't go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I'm sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. We'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: we all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.
    [applause]
    John Donvan: Well, I'm sorry — the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center, and we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th, in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to Be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to Be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on The U.S. Should Adopt the "Right to Be Forgotten" Online: 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe handed down its "Right to Be Forgotten" ruling a little while back, most American academics were quite skeptical and even outraged, and you said that you actually thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a co-founder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because — you’re saying — something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America toward this notion of law. It’s more the academic pleasure, and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten": the words, the natural words, don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data — you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already in the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment: first of all, you have to understand this just doesn’t fall from the sky like this, but it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals, and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and there was a legal obligation in Spain to have a publication of this fact in a newspaper, in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state — because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the courts. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part. And Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America: a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," are powerful. And understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: the right to be forgotten. We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information: not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google, while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut — W-U-U-T — which are applications designed not to be searchable, and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through the opening round of this Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or, for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, misses some credit card bills, and files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her — what she’s like, and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called expungement statutes that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not-so-smart guy — we would have been laughed out of the room: "What a ridiculous proposal. What a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened in the last few years — applies for a job, the employer puts his name into Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends — you know, in the ’90s they would have gossiped, where today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too; I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

[laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I'm Adam. This — this law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.
    Paul Nemitz: It's not Google-specific; it's that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side. For the —
    [laughter]
    Eric Posner: You make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion — they're arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on.
    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it's exactly the kind of surveillance on who's wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.
    John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me?" in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here — and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I'm saying another way of press suppression —
    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody's life. And so, you know, my view on this is, still: censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And, by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area — this is an area which is subject to the law. It's not a discretionary decision of Google where, you know, there's the blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: all of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.
    John Donvan: But it does — it does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that's the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. So Google makes money from search — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don't want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination. And it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law — an area of the law — and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it's you: that perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can't believe that, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web — let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening keeps shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn't hear from the other side answers to questions like: when Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know? Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn't have solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won't go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I'm sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.
    [applause]
    John Donvan: Well, I'm sorry — the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richmond Center for Business Law and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and the first vote: 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control — they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times and, if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the “for” side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or an individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? A) Whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of like EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Faulkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper, maybe, the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, which says that as to a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU, it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law; but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I’m saying it’s another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law as anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, that’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. Google makes money by searching well: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know, because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, directorate general for justice and consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion: the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in the times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote: the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation, which is what speech on the internet is — it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you: that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now, this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it’s not on a specific case, only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online. On the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent on the first vote to 56 percent on the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and for children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty, the power to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. And it works by suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty, in the good work that Paul is doing in the EU, to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The US should adopt the 'right to be forgotten' online." Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I'm Adam. This law seems to be tailored very specifically to Google. And I'm wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.
    Paul Nemitz: It's not Google-specific; it's just that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to deny a request, or didn't want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate, how strict the criteria would be, and I suspect Jonathan and I are very close together, meaning that he's on our side.
    [laughter]
    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think there are administrative criteria we can set. Now, often it's very difficult to set them up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion; they're arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search-and-seizure context, where you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I'm going to — unless you really need to respond to that question — I'm going to move on.
    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found, and would find, that unconstitutional, because it's exactly the kind of surveillance on who's wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, in the EU, it sounds like, would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.
    John Donvan: Okay. So you're asking, if you were smart enough — which sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion, and the right to ask what you have about me, in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is: he described a case where he would, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.
    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal against imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I'm saying another way of press suppression —
    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life, I think, is heartbreaking. And it's something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody's life. And so my view on this is, still, that censorship — which I don't mean to be kind of a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we're going to delete or not, what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly — and then a judge will decide — or you can go to an independent data protection authority, which is something like an independent agency, like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google — you know, blue sky, and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I've got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.
    John Donvan: But it does — it does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that's the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. Google makes money — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don't want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine, in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is "The US should adopt the 'right to be forgotten' online."

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the European Commission's Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right of informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction. This is not about empowering an individual on one side of a conversation — which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you. Perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor: this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can't believe, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web — let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it's shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn't know solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding, when it's a search on a name, that maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such (and then many won't go to the second page); at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I'm sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.
    [applause]
    John Donvan: Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on The U.S. Should Adopt the "Right to be Forgotten" Online: 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent, which means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second; they went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision, you know, was in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. In the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is, I think, one of the most fundamental rights of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. The "right to be forgotten" intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles (the ones that I found were in the English-language press in the UK) outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, whereas maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days the divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

[laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

Let it do so in camera. That is to say in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let — Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state with some trepidation to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog; go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for instance, which Andrew gave from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) Whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Faulkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over. I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power would be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, that says, you know, in the Fourth Amendment search and seizure context, that a reasonable expectation of privacy — you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power to Google, a private corporation, to make the decision yes or no — we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in the times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual is on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening is shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it’s not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this was a little bit of a tough debate in terms of it being relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and the first vote: 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning were very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and the federal trade commissioner, Julie Brill, just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google — any number of articles. The ones that I found were in the English language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat, Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy, he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times and, if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information — in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant — to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response to their calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, when we say "deletion of information published by others," that is deletion — literally — of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or if we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power, or for the individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff — before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook — you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the U.S. is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, which won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The U.S. should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; the case just happened to be about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the U.S.: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the U.S. to the point where you would find it acceptable? Jonathan Zittrain: It is curious that you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. [laughter] Eric Posner: You know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search-and-seizure context, which says you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would — you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that and, if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law; but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe apply to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir, in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying it’s another form of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir, in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of that authority, like against a decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, it’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money from search: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The U.S. should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the U.S. should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote that the U.S. should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion: the U.S. should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation, which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But in each case the censorship is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you: that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard of, "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it’s not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already?
    Male Speaker: It was erased.
    [laughter]
    Male Speaker: Does anybody remember?
    [applause]
    John Donvan: All right. Well, for the sake of people who are listening —
    [laughter]
    Play along, please.
    [laughter]
    Our motion is this: "The U.S. Should Adopt the ‘Right to be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion had a first vote of 26 percent and a second of 56 percent. They went up 30 percentage points.
    [applause]
    John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time.
    [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz.
    [applause]
    John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC.
    [laughter]
    John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what?
    Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google.
    John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument.
    Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School.
    John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause]
    John Donvan: Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now?
    Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough.
    John Donvan: Especially after tonight.
    Eric Posner: Especially after tonight.
    John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."
    [applause]
    John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin.
    [applause]
    John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language?
    Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty.
    John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew?
    Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain.
    John Donvan: Ladies and gentlemen, Jonathan Zittrain.
    [applause]
    John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but —
    [laughter]
    Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes.
    John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz.
    [applause]
    Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he or she has done or in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, more than any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank, or the school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper, in order to make the auctioning of the house to cover the social contributions more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision, you know, was in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin.
    [applause]
    Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," are powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information: not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause]
    Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause]
    John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy —
    [applause]
    John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain.
    Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascistic dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) Whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it just happens that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute that said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, that says, you know, in the Fourth Amendment search and seizure context, that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me, in Europe, applies to the state and also to corporations. But of course, within reasonable national security limitations and so on, you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law as anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency, like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money by searching well: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation. Because that is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the folks on the other side were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it’s not about specific cases, only the overall criteria, and that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April, we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]
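The scoring rule Donvan states (the side whose share of the vote moves the most between the two polls wins) can be checked against the announced numbers. This is a minimal sketch: the function name is illustrative, and the post-debate undecided share (9 percent) is inferred so the three shares sum to 100.

```python
# Intelligence Squared scoring rule: the side whose vote share moves the
# most between the pre- and post-debate polls is declared the winner.

def debate_winner(pre: dict, post: dict) -> tuple:
    """Return (winning side, percentage-point gain)."""
    deltas = {side: post[side] - pre[side] for side in ("for", "against")}
    side = max(deltas, key=deltas.get)
    return side, deltas[side]

# Announced results for "The U.S. Should Adopt the 'Right to Be Forgotten' Online"
pre = {"for": 36, "against": 26, "undecided": 38}
post = {"for": 35, "against": 56, "undecided": 9}  # undecided inferred (100 - 35 - 56)

winner, gain = debate_winner(pre, post)
print(winner, gain)  # → against 30
```

Note that the against side’s gain works out to 30 percentage points (26 to 56), consistent with the announced first and second votes.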

  • 01:31:40

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us, who is your partner in that argument? Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said that you actually thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s — "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten": the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice, or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals, and you profile them. When you put in the name of an individual, Google gives you more than anything else, than any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank, you know, the school, after a certain time, for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated. And there was a legal obligation in Spain to publish this fact in a newspaper, in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper; it stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go on to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." One last remark: you already have it in some parts of the United States, in some parts of law. For example, in California there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.
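The seven-year rule Nemitz mentions can be illustrated with a small filter over credit records. This is a simplified sketch: the record structure and dates are made up for illustration, and the real Fair Credit Reporting Act uses longer windows for some items (for example, ten years for certain bankruptcies).

```python
from datetime import date, timedelta

# Approximate seven-year reporting window (ignoring leap days for simplicity).
SEVEN_YEARS = timedelta(days=7 * 365)

def reportable(records, today):
    """Keep only records recent enough to appear on a credit report
    under the seven-year rule described in the debate."""
    return [r for r in records if today - r["date"] <= SEVEN_YEARS]

# Hypothetical records as of the debate date (March 11, 2015).
records = [
    {"item": "late payment", "date": date(2006, 5, 1)},   # older than 7 years: dropped
    {"item": "late payment", "date": date(2014, 2, 1)},   # within 7 years: kept
]

recent = reportable(records, date(2015, 3, 11))
print([r["item"] for r in recent], [r["date"].year for r in recent])
```

The point of the example is the asymmetry Nemitz draws on: the deletion is triggered purely by age, not by whether the information is true.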

  • 00:13:02

    So it is already there in part, and the Federal Trade Commissioner, Julie Brill, just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is, I think, one of the most fundamental rights of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information: not libel, not defamation, not hate speech.

  • 00:15:59

    It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said, the way he framed it, that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about a person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google, well, you can Google while you can, any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have articles about it suppressed in searches on his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note: as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, and she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves: not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested; even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal. What a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay, that’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name into Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, the divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can — back to you guys — I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anybody to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog — like, go on down, enjoy the stacks — to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "right to be forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "right to be forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post — namely, that a concert pianist was asking for a takedown of the critique, the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power, or for the individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook — you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? A) Whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made the decision himself or herself to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, saves the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose it as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU, it sounds like, give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on; you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side. Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying it’s another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete, is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency, like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of that authority, like against the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, that’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified only when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you: that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that’s not on a specific case, only on the overall criteria; that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. And we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": in the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, and their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve most recently written the book "The Twilight of Human Rights Law." And back in the U.S., when Europe adopted its "Right to Be Forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s — "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because, you’re saying, something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten": the words, the natural words, don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice, or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part. And the Federal Trade Commissioner, Julie Brill, just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a power to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said, the way he framed it, that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. Those of you who are old enough will remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant" information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is, the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog — like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here" — like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know — an interest in someone who stepped out, him or herself, into the public — the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — the good work that Paul is doing in the EU — to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes — and there’s frequently a time component, you know, five, 10 years ago — and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected — the politicians, the business people — to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish premier league referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in a story that won’t be linked under the other guy’s name push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I'm Adam. This — this law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of, like, incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.
    Paul Nemitz: It's not Google-specific; it's just that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: Congress passed a statute and said something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious that you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It's not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side. For the —
    [laughter]
    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set these up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion — they're arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search and seizure context, which says that as for a reasonable expectation of privacy, you don't have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. That's in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I'm going to — unless you really need to respond to that question — I'm going to move on.
    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it's exactly the kind of surveillance on who's wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.
    John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law; but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "What do you have about me?" in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back, near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you'd like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I'm saying another way of press suppression —
    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody's life. And so, you know, my view on this is, still, censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency, like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area — this is an area which is subject to the law. It's not a discretionary decision of Google, where, you know, there's the blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.
    John Donvan: But it does — it does live on. It's not removed from the internet. It's just you can't find it through the criminal's name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that's the classic rule for the law, so that the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. So Google makes money by searching; the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don't want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion: the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote: the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation — which is what speech on the internet is; it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it's justified, and it overcomes a very high bar, in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine, again, but this time that it's you: that perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children; or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now, this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell. I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you? Is the "he" me? Or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can't believe that, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    Jonathan Zittrain: This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it's shifting. It's a standard that's, "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn't know solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it's a search on a name, that maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page, marked as such — and then many won't go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I'm sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.
    [applause]
    John Donvan: Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richmond Center for Business Law and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here, on the 15th, in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already?
    Male Speaker: It was erased. [laughter]
    Male Speaker: Does anybody remember? [applause]
    John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause]
    John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered — by those links to unflattering things five years from now, or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause]
    John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter]
    John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what?
    Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google.
    John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument.
    Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School.
    John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve most recently written the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe adopted its "right to be forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now?
    Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough.
    John Donvan: Especially after tonight.
    Eric Posner: Especially after tonight.
    John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause]
    John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause]
    John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language?
    Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty.
    John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew?
    Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain.
    John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause]
    John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter]
    Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes.
    John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause]
    Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data — you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion — of course, within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he has done or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society — because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them — because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask the school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state — because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives parents the possibility — and children, when they become 18 — to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much.
    John Donvan: Thank you, Paul Nemitz. [applause]
    John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause]
    Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion — the emotional rationale that lies behind the "right to be forgotten" — is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: the right to be forgotten. We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information — not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about a person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.
    John Donvan: Thank you, Andrew McLaughlin. [applause]
    John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause]
    Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — and for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause]
    John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause]
    John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain.
    Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, "Here’s what was deleted today," it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anyone to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent one. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest in knowing what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening, tomorrow you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or the individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made him or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what's called "hitting a curveball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I'm Adam. This law seems to be very specifically tailored to Google. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it's such a bad idea.
    Paul Nemitz: It's not Google specific — it's that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It's not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side.
    [laughter]
    Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set these up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion — they're arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, where a reasonable expectation of privacy is something you don't have the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on.
    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it's exactly the kind of surveillance on who's wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the for side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.
    John Donvan: Okay. So you're asking, if you were smart enough — which sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you'd like to address that.
    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?
    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I'm saying it's another way of press suppression —
    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kid put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody's life. And so, you know, my view on this is, still: censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google where, you know, it's blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.
    John Donvan: But it does live on. It does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that's the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don't want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine, in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is "The US Should Adopt the 'Right to Be Forgotten' Online."

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that's our motion, "The US Should Adopt a 'Right to Be Forgotten' Online." And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified by overcoming a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you. Perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come of this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can't believe that in the last slot I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it's shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff stays out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it's not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn't offer solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won't go to the second page; reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I'm sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.
    [applause]
    John Donvan: Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, IQ2US.org, and make a donation there. Our next debate is later this month, at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. You have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. Tell us who is your partner in that argument. Paul Nemitz: The eminent Professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve most recently written the book "The Twilight of Human Rights Law." And back in the U.S., when Europe handed down its "right to be forgotten" ruling a little while back, most American academics were quite skeptical and even outraged, and you said that you actually thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a co-founder of the Berkman Center for Internet & Society. You looked at Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. Go to the keypads at your seat and take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. We’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments, and then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn. Here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to convert America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the words "right to be forgotten" don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already in the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete your medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of individuals to decide for themselves what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment: first of all, you have to understand this doesn’t just fall from the sky; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than any other source of information about this person, and our law requires that in the same way that people can ask data brokers, the bank, or the school for things to be deleted after a certain time, they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and there was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of informational self-determination, which allows individuals to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that they are now dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to court. Google has understood the European law of balance between privacy and free speech. The decision was, in the beginning, very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files are sold, for example, in political campaigns. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, you can ask for credit information which is older than seven years to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as a kind of permanent record that follows them everywhere. And the fear it generates is that you will forever be defined by that one mistake you made, the one that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even get into the implementation dilemmas, all of the practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument I’m going to make. Let me begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty: the power to force other people to forget what they would otherwise remember. And Paul owned up to this very directly in his opening statement. That right to remember is, I think, one of the most fundamental rights of being a human being. This works by suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information: not libel, not defamation, not hate speech.

  • 00:15:59

    This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said, the way he framed it, that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles (the ones that I found were in the English-language press in the UK) outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches on his name. And these are from 2010, 2011. These are not ancient history. One other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this interest. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might think more broadly, as a society, about how we treat past mistakes and errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so, finally, I’ll just end on this note: as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through the opening round of this Intelligence Squared US debate. I’m John Donvan. We have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 or 30 years. Those of you who are old enough will remember; for the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." Privacy is a complicated concept, so I want to convey it to you with a few examples. Imagine a 17-year-old boy arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not. But they know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her, what she’s like, and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside the area, hears about these events. And the law protects them. There were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves: not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic, a kooky academic like me or Jonathan, came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested; even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan (smart guy) or me (not so smart guy), we would have been laughed out of the room: "What a ridiculous proposal. What a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, the old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. So, if we go through these three people again: this kid, 10 or 15 years later — let’s suppose all of this happened, you know, in the last few years — applies for a job. The employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, who in the ’90s would have gossiped, today would write about it on Facebook. Maybe she would have written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, a divorce would be public. It’s public information. But it was put in a file stored in the basement of a courthouse. If you wanted to, you could find that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, their children could type their parents’ names into Google and out come these allegations. And not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. The neighbors, the people in the neighborhood, know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. If the question is "Are there problems with privacy?" then yes, there are, and if that were the motion, I would hope you would join us in voting for it. If the question were "Have problems of privacy gotten more difficult over time?" we’d vote for that motion, too; I think I’m comfortable saying Andrew would vote for it. If the motion were "Should everything be recorded at all times, and if so, should we do something about that?" I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera — that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can — back to you guys — I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the “for” side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, that is — but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist dictators or communist dictators, don’t want anyone to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog — like, go on down, enjoy the stacks — to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person — become a politician — journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post — namely that a concert pianist was asking for a takedown of the critique, the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power or individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make — if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff — before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook — you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is that through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law — that’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been made for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the U.S. is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about — they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
John Donvan: Well —
Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.
John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."
Jonathan Zittrain: Thank you.
John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The U.S. Should Adopt the "Right to Be Forgotten" Online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
Male Speaker: Hi.
John Donvan: If you can stand up and grab the mic.
Male Speaker: Hi. I'm Adam. This law seems to be very specifically tailored to Google. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
John Donvan: Paul Nemitz, in Europe.
Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press, to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that?
Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
Paul Nemitz: All search engines.
Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.
Paul Nemitz: It's not Google-specific just because the case was about Google.

  • 01:00:57

But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
John Donvan: Sir. Can you tell us your name, please?
Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the U.S.: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
John Donvan: Good question.
Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —
John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the U.S. to the point where you would find it acceptable?
Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.
Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.
John Donvan: Eric, did you want to respond to that?
Eric Posner: It's not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about — how strict the criteria would be — and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side.
[laughter]
Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set these up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.
John Donvan: [unintelligible] take right down in front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —
Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion — they're arguing for the "right to be forgotten."
Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner.
Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.
John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on.
Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —
John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.
Jonathan Zittrain: I —
John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found, and would find, that unconstitutional, because it's exactly the kind of surveillance of who's wanting to peek under the envelope that all sides appear to be concerned about.
John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
[laughter]
John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
[talking simultaneously]
John Donvan: Yeah, thanks.
Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know — in the EU it sounds like it'd give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that; and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.
John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are —
[laughter]
John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
John Donvan: So, how far are you getting in asking the NSA to delete information?
[laughter]
Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks.
Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you'd like to address that.
John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?
Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —
Male Speaker: Not exactly. I'm saying another way of press suppression —
John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did — or should they not find it at all?
[applause]
Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.
John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They're only looking at what comes up on the first page. Andrew McLaughlin.
Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody's life. And so, you know, my view on this is, still: censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power — the adjudicating power — to Google, a private corporation, to make the decision, yes or no, we're going to delete or not: what are the checks on that from outside? How does that work?
Paul Nemitz: Well, Google, as a private corporation, is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.
Male Speaker: But Paul —
Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google — that, you know, there's the blue sky, and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
Male Speaker: Yeah, Paul, I have to — [unintelligible].
John Donvan: [unintelligible].
Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: all of the pressures that you just outlined are toward deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.
John Donvan: But it does — it does live on. It's not removed from the internet. It's just you can't find it through the criminal's name.
Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with that.

  • 01:17:01

They have an interest in that.
Eric Posner: There are interests on both sides.
John Donvan: Eric Posner.
Eric Posner: So, that's the classic role for the law, so that the interests on both sides —
Andrew McLaughlin: How are they vindicated?
Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —
[talking simultaneously]
Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
Eric Posner: No, but they don't want to set a precedent where they —
Jonathan Zittrain: Google will pay no penalty for over-deleting.
Eric Posner: They will.
Jonathan Zittrain: And no one will even know.
Eric Posner: They will.
Jonathan Zittrain: Who will even know?
Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The U.S. Should Adopt the "Right to Be Forgotten" Online.

  • 01:17:58

[applause]
John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the U.S. should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual — as a free citizen of the United States — and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote that the U.S. should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.
John Donvan: Thank you, Paul Nemitz.
[applause]
John Donvan: And that's our motion: The U.S. Should Adopt a "Right to Be Forgotten" Online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case — the affirmative case — has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
John Donvan: Thank you, Andrew McLaughlin.
[applause]
John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.
Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you. Perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says — falsely, maybe — that you neglected the children; or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come of this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
John Donvan: Thank you, Eric Posner.
[applause]
John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
Jonathan Zittrain: I can't believe that, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
[laughter]
This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it's shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case — only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn't offer solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding that, when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page, marked as such, and then many won't go to the second page; at reputation systems; and, of course, competition —
John Donvan: Jonathan Zittrain, I'm sorry —
Jonathan Zittrain: — can I end with one thing —
John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
[laughter]
[applause]
And that concludes our closing statements.

  • 01:27:04

And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]
John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
[talking simultaneously]
Jonathan Zittrain: Here was the landing I was hoping to make: we all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.
[applause]
John Donvan: Well, I'm sorry — the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and the ticket sales don't come close to covering the cost of mounting a great debate like this. So, I'd encourage you, if you would be willing to make a donation, to go to our website, that's IQ2US.org, and make a donation there. Our next debate is later this month. It's at Columbia University at the Miller Theater. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We're going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we're going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We've heard the arguments — sorry? Did I — sorry? There's chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote. "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and in the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We'll see you next time. [applause]

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I've done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let's have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I'm John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

Let's meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that's what you're going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You've written, most recently, the book "The Twilight of Human Rights Law." And back in the U.S., when Europe established its "right to be forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven't, but I'm hoping they'll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let's welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You're no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I'd probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There's really no place in the world that's been so affected by it. But I do think it's a travesty. John Donvan: Okay, we're making clear you're not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you're a professor of law and computer science at Harvard. You're a co-founder of the Berkman Center for Internet & Society. You looked at the EU's — Europe's "right to be forgotten," and you said that there is a certain elegance to the idea because it — you're saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I've described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We're very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you'll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let's register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

Push number one if you're for the motion, number two if you're against, and number three if you're undecided. And we'll take about 15 seconds to complete that process, and then we'll lock out the voting devices. All right, folks, let's lock out those votes. And, again, we'll have the second vote right after you've heard the closing arguments. And then we'll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I'm not here on an official mission from the European Commission to turn America to this notion of law. It's more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten": the words, the natural words, they don't tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the '70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can't ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that's what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don't have that privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, now I turn to the recent judgment. First of all, you have to understand this just doesn't fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn't apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, the school, after a certain time, for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case, from Spain, had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it's in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn't mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which go to the courts. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what's happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I'm going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn't even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that's the outline of the argument that I'm going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it's a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty, the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let's be very clear, that is what we're talking about here. We're talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we're talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it's way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

The language in the European Court's decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It's very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul's claim that it's working kind of well, you can Google — well, you can Google — any number of articles, the ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you'll find the case of the piano player who didn't like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it's conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that's not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we've seen that the internet adapts. We've seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so, finally, I'll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what's going on: we are halfway through this opening round of the Intelligence Squared US debate. I'm John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You've heard two of the opening arguments, and now on to the third. Let's welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I'd like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or for some of the younger people here, I'll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he's arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That's not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It's not a big deal. Second example: a single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

She sees a psychotherapist; she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she's able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she's like and how she is a good mother. So, they know a little bit about what's going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other's infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they're private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let's say it could be an employer or a creditor or a future romantic partner — it's really in the stranger's interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn't convicted, it's still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people's privacy." But, of course, this is what's happened. It's happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the '80s and '90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That's completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people's privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let's suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name into Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother's friends, you know, in the '90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would've written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That's how she's going to be defined to the public. And in the case of the divorced couple: in the old days, the divorce would be public. It's public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents' names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they've lost control over this information about themselves, and it's not just that they've lost this control; they've lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We'd vote for that motion, too. I think I'm comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The US Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too, the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here, where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper, maybe, the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US, and Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased, you know — the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.

    [laughter]

    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion — they’re arguing for the "right to be forgotten."

    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that as for a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "What do you have about me?" applies in Europe to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying another way of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google — that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law, so that the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money from searching: the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion: the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote: the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. Speech on the internet is just that: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine, again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure transparency comes not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t owe solutions, and in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion had a first vote of 26 percent and a second of 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. From me, John Donvan, and Intelligence Squared U.S., we’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared U.S. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe adopted its "right to be forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said that you actually thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time, for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case, from Spain, had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. In the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," are powerful. And they are understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember, and for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s would have gossiped, where today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes, when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say deletion of information published by others, that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government that it must keep it secret, the cases we hear about are happenstance. How does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, I need to — just in the interest of time I need to interrupt you for that, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The US should adopt the 'right to be forgotten' online." Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I'm Adam. This law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.
    Paul Nemitz: It's not Google-specific; it's that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute and said something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side.
    [laughter]
    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion — they're arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that a reasonable expectation of privacy, you don't have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on.
    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it's exactly the kind of surveillance on who's wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU, it sounds like it'd give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.
    John Donvan: Okay. So you're asking, if you were smart enough — which sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe apply to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you'd like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I'm saying another way of press suppression —
    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody's life. And so, you know, my view on this is, still, censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google where, you know, there's the blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.
    John Donvan: But it does live on. It does live on. It's not removed from the internet. It's just you can't find it through the criminal's name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that's the classic role for the law, so that the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated, and what's on the side of disclosure is something far more powerful than the government. It's the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don't want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is "The US should adopt the 'right to be forgotten' online."

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that's our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it's justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you. That perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you. This is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can't believe, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it's shifting. It's a standard that's, "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn't owe solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding that, when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such, and then many won't go to the second page; reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I'm sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.
    [applause]
    John Donvan: Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments -- sorry? Did I -- sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already?
    Male Speaker: It was erased.
    [laughter]
    Male Speaker: Does anybody remember?
    [applause]
    John Donvan: All right. Well, for the sake of the people who are listening --
    [laughter]
    Play along, please.
    [laughter]
    Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points.
    [applause]
    John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time.
    [applause]

  • 00:00:00

    John Donvan: When you -- when you go online and Google yourself -- and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared U.S. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz.
    [applause]
    John Donvan: Paul, you are the director of this -- of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC.
    [laughter]
    John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what?
    Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google.
    John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument.
    Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School.
    John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause]
    Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now?
    Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough.
    John Donvan: Especially after tonight.
    Eric Posner: Especially after tonight.
    John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."
    [applause]
    John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin.
    [applause]
    John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language?
    Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty.
    John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew?
    Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain.
    John Donvan: Ladies and gentlemen, Jonathan Zittrain.
    [applause]
    John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s -- Europe’s -- "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but --
    [laughter]
    Jonathan Zittrain: -- I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes.
    John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz.
    [applause]
    Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" -- the words, the natural words -- they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore -- and now I turn to the recent judgment -- first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank -- you know, they can ask in school after a certain time for things to be deleted -- they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold -- for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California there is now a law which gives the possibility for parents -- and for children when they become 18 -- to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin.
    [applause]
    Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty -- the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information -- and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said -- the way he framed it was -- that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles -- the ones that I found were in the English-language press in the UK -- outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut (W-U-U-T), which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And a reminder of what’s going on: we are halfway through the opening round of this Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause]
    Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough -- or for some of the younger people here, I’ll just have to tell you what it was like then -- when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protected them. There were privacy laws -- they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves -- not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic -- a kooky academic like me or Jonathan -- came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone -- let’s say it could be an employer or a creditor or a future romantic partner -- it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal -- even someone like Jonathan, smart guy, or me, not so smart guy -- we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later -- let’s suppose all of this happened, you know, in the last few years -- he applies for a job, the employer puts his name into Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends -- you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in -- maybe she moves to a new city -- puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause]
    John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy --
    [applause]
    John Donvan: -- School of Government. Ladies and gentlemen, Jonathan Zittrain.
    Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is: are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were: have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were: should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger, if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of like EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in a story that won’t be linked under the other guy’s name push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way in your view to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

John Donvan: Well —

Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

Jonathan Zittrain: Thank you.

John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

Male Speaker: Hi.

John Donvan: If you can stand up and grab the mic.

Male Speaker: Hi. I’m Adam. This law seems to be tailored very specifically to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

John Donvan: Paul Nemitz, in Europe.

Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that?

Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

Paul Nemitz: All search engines.

Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

Paul Nemitz: It’s not Google-specific just because the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy, and profiles you much more than the one website on which you have your stuff.

John Donvan: Sir. Can you tell us your name, please?

Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

John Donvan: Good question.

Male Speaker: And then if, you know, Google wanted to deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

John Donvan: Eric, did you want to respond to that?

Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes, in most states, that allow people’s criminal records to be erased — the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I are very close together, meaning that he’s on our side. [laughter]

Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think there are administrative criteria. Now, often it’s very difficult to set these up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

John Donvan: [unintelligible] take right down in front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "right to be forgotten."

Female Speaker: Right. So, to me it’s based on an inherent right to privacy, but how do you square that with the constitutional point that, at least in the U.S., in the Fourth Amendment search-and-seizure context, you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner.

Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity; you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

John Donvan: Okay, I think we see where you’re going with that. Let’s let Jonathan answer that.

Jonathan Zittrain: I —

John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found, and would find, that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.

John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]

John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

[talking simultaneously]

John Donvan: Yeah, thanks.

Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter]

John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.

John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]

Paul Nemitz: Well, the answer is, he described a case where he would, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks.

Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law: there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps an alarming signal about imposing something like this?

John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —

Male Speaker: Not exactly. I’m saying another way of press suppression —

John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]

Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin.

Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody can create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.

John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, as in Europe and in America, against the decision of the FTC, to go to court.

Male Speaker: But Paul —

Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google — you know, blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

Male Speaker: Yeah, Paul, I have to — [unintelligible].

John Donvan: [unintelligible].

Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

John Donvan: But it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee: the fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that.

Eric Posner: There are interests on both sides.

John Donvan: Eric Posner.

Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —

Andrew McLaughlin: How are they vindicated?

Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —

[talking simultaneously]

Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

Eric Posner: No, but they don’t want to set a precedent where they —

Jonathan Zittrain: Google will pay no penalty for over-deleting.

Eric Posner: They will.

Jonathan Zittrain: And no one will even know.

Eric Posner: They will.

Jonathan Zittrain: Who will even know?

Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause]

John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.

John Donvan: Thank you, Paul Nemitz. [applause]

John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation — which is what speech on the internet is; it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

John Donvan: Thank you, Andrew McLaughlin. [applause]

John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

John Donvan: Thank you, Eric Posner. [applause]

John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t have solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, including possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —

John Donvan: Jonathan Zittrain, I’m sorry —

Jonathan Zittrain: — can I end with one thing —

John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]

John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

[talking simultaneously]

Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause]

John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on The U.S. Should Adopt the "Right to be Forgotten" Online: 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. From me, John Donvan, and Intelligence Squared U.S., we’ll see you next time. [applause]

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data — you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion — of course, within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice, or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side — that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment, first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask the school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and there was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state — because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish — but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go on to the courts. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age — in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you, with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what happens to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California there is now a law which gives parents the possibility — and children, when they become 18 — to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America — a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion — the emotional rationale that lies behind the "right to be forgotten" — are powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas — all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information — not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google, while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut — W-U-U-T — which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important — but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s, would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is: are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were: have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were: should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant," in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anybody to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable: that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I’d really like you guys to give it its fullest consideration: there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody whom I could then demand that my data be removed from, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches [spelled phonetically] instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute that said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I are very close together, meaning that he’s on our side.
    [laughter]
    Eric Posner: You know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set this up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion. They’re arguing for the "Right to Be Forgotten."
    Female Speaker: Right. So, to me, it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me?" in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about it. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google — that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: all of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that referee’s name to be associated with it.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money from search — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come of this. Today, imagine it: this is what people learn about you, the first thing that people learn about you when they Google your name. So when you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically wired into the proposal. And I didn’t hear from the other side answers to questions like: when Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know? Paul has been working, I think, to make sure disclosure happens not on any specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about. You just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business Law and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and on the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. From me, John Donvan, and Intelligence Squared U.S., thank you. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said, in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

They said, sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like, yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the U.S. should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the U.S. should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or an individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty, like the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make a decision on a daily basis: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made the decision himself or herself to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the U.S. is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

John Donvan: Well —

Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.

John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."

Jonathan Zittrain: Thank you.

John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

John Donvan: And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

Male Speaker: Hi.

John Donvan: If you can stand up and grab the mic.

Male Speaker: Hi. I'm Adam. This law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody whom I could then demand that my data be removed from, or not?

John Donvan: Paul Nemitz, in Europe.

Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that?

Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

Paul Nemitz: All search engines.

Andrew McLaughlin: Paul frames this as just like a very "nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.

Paul Nemitz: It's not Google-specific; it's just that the case was about Google.

  • 01:00:57

Paul Nemitz: But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

John Donvan: Sir. Can you tell us your name, please?

Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute and said something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

John Donvan: Good question.

Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —

John Donvan: Okay.

  • 01:02:01

John Donvan: So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?

Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.

Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

Andrew McLaughlin: I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.

John Donvan: Eric, did you want to respond to that?

Eric Posner: It's not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side. For the —

[laughter]

Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

Eric Posner: So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.

John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —

Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion, they're arguing for the "right to be forgotten."

Female Speaker: Right. So, to me, it's based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner.

Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

Eric Posner: And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.

John Donvan: I'm going to — unless you really need to respond to that question — I'm going to move on.

Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity — just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.

Jonathan Zittrain: I —

John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it's exactly the kind of surveillance of who's wanting to peek under the envelope that all sides appear to be concerned about.

John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

[laughter]

John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

[talking simultaneously]

John Donvan: Yeah, thanks.

Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

Male Speaker: It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it'd give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.

John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are —

[laughter]

John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

Paul Nemitz: So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.

John Donvan: So, how far are you getting in asking the NSA to delete information?

[laughter]

Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir, in the back near the wall. Thanks.

Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe — than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

John Donvan: Am I getting —

Male Speaker: Not exactly. I'm saying another way of press suppression —

John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir, in the back in the red sweater, the very far back. Thanks for your question, though.

Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

[applause]

Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.

John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

John Donvan: They're only looking at what comes up on the first page. Andrew McLaughlin.

Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody's life. And so, you know, my view on this is, still, censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

Andrew McLaughlin: They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no — we're going to delete or not — what are the checks on that from outside? How does that work?

Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

Paul Nemitz: But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.

Male Speaker: But Paul —

Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google — that, you know, the blue sky, and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

Male Speaker: Yeah, Paul, I have to — [unintelligible].

John Donvan: [unintelligible].

Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

Andrew McLaughlin: So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I've got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.

John Donvan: But it does — it does live on. It's not removed from the internet. It's just you can't find it through the criminal's name.

Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with that.

  • 01:17:01

Andrew McLaughlin: They have an interest in that.

Eric Posner: There are interests on both sides.

John Donvan: Eric Posner.

Eric Posner: So, that's the classic role for the law, to balance the interests on both sides —

Andrew McLaughlin: How are they vindicated?

Eric Posner: And they are vindicated, and what's on the side of disclosure is something far more powerful than the government. It's the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —

[talking simultaneously]

Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

Eric Posner: No, but they don't want to set a precedent where they —

Jonathan Zittrain: Google will pay no penalty for over-deleting.

Eric Posner: They will.

Jonathan Zittrain: And no one will even know.

Eric Posner: They will.

Jonathan Zittrain: Who will even know?

Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause]

John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers.

Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion: the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

Paul Nemitz: And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote: the US should adopt the "right to be forgotten" online.

  • 01:20:06

Paul Nemitz: Thank you very much.

John Donvan: Thank you, Paul Nemitz.

[applause]

John Donvan: And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

Andrew McLaughlin: It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it's justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

Andrew McLaughlin: And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

John Donvan: Thank you, Andrew McLaughlin.

[applause]

John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.

Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

Eric Posner: It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you — that perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

Eric Posner: Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

John Donvan: Thank you, Eric Posner.

[applause]

John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.

Jonathan Zittrain: I can't believe, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

[laughter]

Jonathan Zittrain: This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

Jonathan Zittrain: Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it's shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Jonathan Zittrain: Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case — only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn't know solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won't go to the second page; reputation systems; and, of course, competition —

John Donvan: Jonathan Zittrain, I'm sorry —

Jonathan Zittrain: — can I end with one thing —

John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

[laughter]

[applause]

John Donvan: And that concludes our closing statements.

  • 01:27:04

John Donvan: And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]

John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

[talking simultaneously]

Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.

[applause]

John Donvan: Well, I'm sorry — the clock is my master and has to be your master — but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th, in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was: The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online. In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent Professor Eric Posner, from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of Human Rights Law." And back in the U.S., when Europe adopted its "Right to Be Forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s — "Right to Be Forgotten," and you said that there is a certain elegance to the idea because — you’re saying — something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten": the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already in the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he or she has done or in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment: first of all, you have to understand this doesn’t just fall from the sky; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, more than any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank, you know, the school, after a certain time, for things to be deleted, they should be able to ask Google for deletion. Now, the person who brought this case, from Spain, had not paid some social contributions, and therefore his house had been confiscated. And it was a legal obligation in Spain to have a publication of this fact in a newspaper, in order to make the auctioning of the house to cover the social contributions more attractive for the Spanish state, because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision, you know, was in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part. And Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty, the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is, I think, one of the most fundamental rights of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information: not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google, while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches for his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so, finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal; what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, "Bing, how many are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascistic dictators or communist dictators, don’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there’s plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan:

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain:
What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
John Donvan:
Well —
Jonathan Zittrain:
So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
John Donvan:
Jonathan, that is what’s called "hitting a curve ball out of the park."
Jonathan Zittrain:
Thank you.
John Donvan:
Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared U.S. debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
Male Speaker:
Hi.
John Donvan:
If you can stand up and grab the mic.
Male Speaker:
Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody whom I could then demand that my data be removed from, or not?
John Donvan:
Paul Nemitz, in Europe.
Paul Nemitz:
This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press, to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan:
Andrew, CEO of Digg, how do you feel about that?
Andrew McLaughlin:
I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
Paul Nemitz:
All search engines.
Andrew McLaughlin:
Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
Paul Nemitz:
It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
John Donvan:
Sir. Can you tell us your name, please?
Male Speaker:
Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the U.S.: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
John Donvan:
Good question.
Male Speaker:
And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
John Donvan:
Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the U.S. to the point where you would find it acceptable?
Jonathan Zittrain:
It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
Andrew McLaughlin:
I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
John Donvan:
Eric, did you want to respond to that?
Eric Posner:
It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —
[laughter]
Eric Posner:
— you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set these up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
John Donvan:
[unintelligible] take right down in front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
Female Speaker:
[unintelligible] as a follow-up question to yours. In order for your side to win —
John Donvan:
Just so that the radio audience knows who you’re talking about — the side arguing for the motion; they’re arguing for the "right to be forgotten."
Female Speaker:
Right. So, to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan:
Eric Posner.
Eric Posner:
Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
John Donvan:
I’m going to — unless you really need to respond to that question, I’m going to move on.
Female Speaker:
For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
John Donvan:
Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
Jonathan Zittrain:
I —
John Donvan:
Thank you for the question.

  • 01:06:56

Jonathan Zittrain:
— I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.
John Donvan:
Another question? Right in the center there. Adamant waving actually does work with me.
[laughter]
John Donvan:
When I see everybody going nuts. Can you tell us your name, please? Again —
[talking simultaneously]
John Donvan:
Yeah, thanks.
Male Speaker:
This question is for the "for" side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
John Donvan:
Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —
[laughter]
John Donvan:
— to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
Paul Nemitz:
Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law; but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion, and the right to ask "what do you have about me?", applies in Europe to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.
John Donvan:
So, how far are you getting in asking the NSA to delete information?
[laughter]
Paul Nemitz:
Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan:
Sir, in the back near the wall. Thanks.
Male Speaker:
Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
John Donvan:
Can you rephrase what your question specifically is in terms of the motion here?
Male Speaker:
In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?
John Donvan:
In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —
Male Speaker:
Not exactly. I’m saying another way of press suppression —
John Donvan:
Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir, in the back in the red sweater, the very far back. Thanks for your question, though.
Male Speaker:
Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should the first thing they find be this racist chant they did, or should they not find it at all?
[applause]
Jonathan Zittrain:
I wouldn’t want Google to be uniquely privileged to answer that question.
John Donvan:
Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner:
I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
John Donvan:
From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin.
Andrew McLaughlin:
Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem inclined to do.
John Donvan:
Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
Paul Nemitz:
Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC.
Male Speaker:
But Paul —
Paul Nemitz:
This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
Male Speaker:
Yeah, Paul, I have to — [unintelligible].
John Donvan:
[unintelligible].
Andrew McLaughlin:
I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
John Donvan:
But it does — it does live on. It does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.
Andrew McLaughlin:
They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that.
Eric Posner:
There are interests on both sides.
John Donvan:
Eric Posner.
Eric Posner:
So, that’s the classic role for the law, to balance the interests on both sides —
Andrew McLaughlin:
How are they vindicated?
Eric Posner:
And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money from search — the better its searches are, the more money it makes. This is why —
[talking simultaneously]
Jonathan Zittrain:
— talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
Eric Posner:
No, but they don’t want to set a precedent where they —
Jonathan Zittrain:
Google will pay no penalty for over-deleting.
Eric Posner:
They will.
Jonathan Zittrain:
And no one will even know.
Eric Posner:
They will.
Jonathan Zittrain:
Who will even know?
Eric Posner:
They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
John Donvan:
And that concludes round two of this Intelligence Squared U.S. debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:17:58

[applause]
John Donvan:
Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
Paul Nemitz:
Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the U.S. should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the U.S. should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.
John Donvan:
Thank you, Paul Nemitz.
[applause]
John Donvan:
And that’s our motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
Andrew McLaughlin:
You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children — we censor a lot of different kinds of information. But it’s justified; it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
John Donvan:
Thank you, Andrew McLaughlin.
[applause]
John Donvan:
And the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
Eric Posner:
This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children; or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
John Donvan:
Thank you, Eric Posner.
[applause]
John Donvan:
The motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.
Jonathan Zittrain:
I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
[laughter]
This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web — let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening has been shifting. It’s a standard of, "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure there is transparency not on a specific case, only on the overall criteria; that seems very, very dangerous to me. Now, John said we didn’t owe solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —
John Donvan:
Jonathan Zittrain, I’m sorry —
Jonathan Zittrain:
— can I end with one thing —
John Donvan:
— your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.
[laughter]
[applause]
And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]
John Donvan:
And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
[talking simultaneously]
Jonathan Zittrain:
Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
[applause]
John Donvan:
Well, I’m sorry the clock is my master and has to be your master — the people voted before hearing that — but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s — "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said, sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask a school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is — a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut (W-U-U-T), which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

[laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

I would see it exactly the other way around. History has taught Europeans that, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, I need to — just in the interest of time I need to interrupt you for that, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in a story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google specific that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power would be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point — the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "For" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high on your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law as anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it’s not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already?
    Male Speaker:
    It was erased. [laughter]
    Male Speaker:
    Does anybody remember? [applause]
    John Donvan:
    All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in their first vote to 56 percent in their second. They went up 30 percentage points. [applause]
    John Donvan:
    The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan:
    When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause]
    John Donvan:
    Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter]
    John Donvan:
    And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what?
    Paul Nemitz:
    I meant that there are limits to snooping, collecting, and making my private life public on Google.
    John Donvan:
    And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument.
    Paul Nemitz:
    The eminent professor Eric Posner from the University of Chicago Law School.
    John Donvan:
    Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe established its "Right to Be Forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now?
    Eric Posner:
    No, they haven’t, but I’m hoping they’ll change their minds soon enough.
    John Donvan:
    Especially after tonight.
    Eric Posner:
    Especially after tonight.
    John Donvan:
    Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause]
    John Donvan:
    And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause]
    John Donvan:
    Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language?
    Andrew McLaughlin:
    [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty.
    John Donvan:
    Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew.
    Andrew McLaughlin:
    The equally eminent Professor Jonathan Zittrain.
    John Donvan:
    Ladies and gentlemen, Jonathan Zittrain. [applause]
    John Donvan:
    The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a co-founder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain:
    Well, we could end early and just hit the bar, but — [laughter]
    Jonathan Zittrain:
    — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes.
    John Donvan:
    All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause]
    Paul Nemitz:
    Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice, or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask a school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, in the beginning, very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives parents the possibility, and children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much.
    John Donvan:
    Thank you, Paul Nemitz. [applause]
    John Donvan:
    And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause]
    Andrew McLaughlin:
    Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake you made that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so, finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.
    John Donvan:
    Thank you, Andrew McLaughlin. [applause]
    John Donvan:
    And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause]
    Eric Posner:
    I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal! What a tremendous invasion of people’s privacy!" But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, whereas maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause]
    John Donvan:
    Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause]
    John Donvan:
    — School of Government. Ladies and gentlemen, Jonathan Zittrain.
    Jonathan Zittrain:
    Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no. And Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift — they’re like looking at the box and it’s ticking — I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera — that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone — it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can — back to you guys — I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down — I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog — like, go to town, enjoy the stacks — to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person — become a politician — journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here" — like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with the public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know — an interest in someone who stepped out, him or herself, into the public — the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, for example, from the Washington Post — namely that a concert pianist was asking for a takedown of the critique, the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I’ll give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes. So that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power, or the individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make — if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff — before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook — you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes — and there’s frequently a time component, you know, five, 10 years ago — and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about — they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay. I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared U.S. debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The U.S. should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just a very "nothing to see here," kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific; it’s that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the U.S.: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the U.S. to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about — how strict the criteria would be — and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.

    [laughter]

    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, they are often very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build them up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion — they’re arguing for the "right to be forgotten."

    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search-and-seizure context, which says you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking: if you were smart enough, which it sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe apply to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying another way of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America against a decision of the FTC, to go to court.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google — that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. Nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law, so that the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money by searching — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes round two of this Intelligence Squared U.S. debate, where our motion is the U.S. should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the European Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the U.S. should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the U.S. should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that’s our motion: the U.S. should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But in each case the censorship is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff comes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it is not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and in the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said, sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case, from Spain, had not paid some social contributions, and therefore his house had been confiscated. And it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is the power to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal like that, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy is real, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist dictators or communist dictators, don’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, gees, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, which at the insistence of the government it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taken for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the U.S. is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, from the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in a story, which won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific — it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to deny a request, or didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes, in most states, that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about — how strict the criteria would be — and I suspect Jonathan and I are very close together, meaning that he’s on our side.
    [laughter]
    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think there are administrative criteria. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion — they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search and seizure context, where, as to a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and, I think, the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and, if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is: he described a case where he would, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so my view on this is, still, censorship — which I don’t mean to be a kind of nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google — that, you know, there’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that referee’s name to be associated with it.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law — so that the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. Speech on the internet is a conversation: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push, by empowering the individual on one side of that conversation, is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But such censorship is justified only when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that happens not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible, whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you if you would be willing to make a donation to go to our website, that’s IQ2US.org and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law and Public Policy at Columbia, as well as with the National Constitution Center, we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online and the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote the result was for the team arguing for the motion their second vote was 35 percent, that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percent. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why this is important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you. So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station, all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten" is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten".

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember is one of the most fundamental rights I think of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing, vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat, Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern, Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy, he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on but they still have a relatively complete image of who she is. A third example, a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called, "expungement statutes," that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out comes these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is are there problems with privacy? Yes, there are. And if that were the motion I would hope you would join us in voting for it. If the question were have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, saying here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera, that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The US Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The US Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question and I don’t feel you’re answering it, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, when we say "deletion of information published by others," that is deletion, literally, of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, and tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving is that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough — that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is raise the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here." Like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes. That’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is that through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examp les, the guy who did the professional [unintelligible] the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, I need to just in the interest of time I need to interrupt you for that, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was justmaking. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy and Paul is actually saying it’s the ordinary guy who’s getting the take downs and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: Well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, in the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. Which is that I very much agree with Paul that what is in front of us here is a sort of false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, saves the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked on implementation. Are there cases where individuals who are otherwise mentioned in the story — that won’t be linked under the other guy’s name — push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain:
    What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan:
    Well —

    Jonathan Zittrain:
    So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan:
    Jonathan, that is what’s called "hitting a curveball out of the park."

    Jonathan Zittrain:
    Thank you.

    John Donvan:
    Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker:
    Hi.

    John Donvan:
    If you can stand up and grab the mic.

    Male Speaker:
    Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan:
    Paul Nemitz, in Europe.

    Paul Nemitz:
    This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan:
    Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin:
    I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz:
    All search engines.

    Andrew McLaughlin:
    Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents, in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz:
    It’s not Google-specific just because the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy, and profiles you much more than only the one website on which you have your stuff.

    John Donvan:
    Sir. Can you tell us your name, please?

    Male Speaker:
    Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan:
    Good question.

    Male Speaker:
    And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan:
    Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?

    Jonathan Zittrain:
    It is curious that you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin:
    I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan:
    Eric, did you want to respond to that?

    Eric Posner:
    It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased; you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.

    [laughter]

    Eric Posner:
    You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria. Now, often it’s very difficult to set this up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan:
    [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker:
    [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan:
    Just so that the radio audience knows who you’re talking about: the side arguing for the motion — they’re arguing for the "right to be forgotten."

    Female Speaker:
    Right. So, to me, it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan:
    Eric Posner.

    Eric Posner:
    Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that will probably respect both interests, the best that can be done.

    John Donvan:
    I’m going to — unless you really need to respond to that question — I’m going to move on.

    Female Speaker:
    For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past, there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity; you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —

    John Donvan:
    Okay, I think we see where you’re going with that. Let’s let Jonathan answer that.

    Jonathan Zittrain:
    I —

    John Donvan:
    Thank you for the question.

  • 01:06:56

    Jonathan Zittrain:
    — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan:
    Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan:
    When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan:
    Yeah, thanks.

    Male Speaker:
    This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and, if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan:
    Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan:
    — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz:
    Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law; but, of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion, and the right to ask what you have about me, in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on; you know, there we are not so far apart.

    John Donvan:
    So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz:
    Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan:
    Sir, in the back near the wall. Thanks.

    Male Speaker:
    Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

    John Donvan:
    Can you rephrase what your question specifically is, in terms of the motion here?

    Male Speaker:
    In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is, perhaps, you know, an alarming signal about imposing something like this?

    John Donvan:
    In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker:
    Not exactly. I’m saying another way of press suppression —

    John Donvan:
    Oh, in that case, I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir, in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker:
    Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain:
    I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan:
    Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner:
    I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook, and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan:
    From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin:
    Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem inclined to do.

    John Donvan:
    Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz:
    Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker:
    But Paul —

    Paul Nemitz:
    This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker:
    Yeah, Paul, I have to — [unintelligible].

    John Donvan:
    [unintelligible].

    Andrew McLaughlin:
    I think you’re evading the central objection of our side to what you’re saying, and it is this: all of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan:
    But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.

    Andrew McLaughlin:
    They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner:
    There are interests on both sides.

    John Donvan:
    Eric Posner.

    Eric Posner:
    So, that’s the classic role for the law, to balance the interests on both sides —

    Andrew McLaughlin:
    How are they vindicated?

    Eric Posner:
    And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. Google makes money by searching well: the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain:
    — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner:
    No, but they don’t want to set a precedent where they —

    Jonathan Zittrain:
    Google will pay no penalty for over-deleting.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    And no one will even know.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    Who will even know?

    Eric Posner:
    They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan:
    And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan:
    Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz:
    Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan:
    Thank you, Paul Nemitz.

    [applause]

    John Donvan:
    And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin:
    You should vote against the motion. The construction that Paul just put forward is, in my judgment, a false construction. Speech on the internet is a conversation: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is the empowerment of an individual on one side of that conversation — a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement; we censor sexual abuse images of children; we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden of showing that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan:
    Thank you, Andrew McLaughlin.

    [applause]

    John Donvan:
    And the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner:
    This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children; or maybe a statement that you made about your own children, that you wouldn’t want them to hear, is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan:
    Thank you, Eric Posner.

    [applause]

    John Donvan:
    The motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain:
    I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the bad stuff goes out and the good stuff stays." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan:
    Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain:
    — can I end with one thing —

    John Donvan:
    — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan:
    And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain:
    Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan:
    Well, I’m sorry — the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month at Columbia University, at the Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates. And I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion had a first vote of 26 percent and a second vote of 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared U.S. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written the book, most recently, "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s — "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because, you’re saying, something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten": the words, the natural words, don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact, in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper; it stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part. And the Federal Trade Commissioner, Julie Brill, just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: the "right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul framed it as a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

The language in the European Court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this; they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay, that’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the U.S. should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the U.S. should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America: yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — is it more chilling for some entity or individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there’s plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes.

John Donvan: Well —

Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

Jonathan Zittrain: Thank you.

John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

Male Speaker: Hi.

John Donvan: If you can stand up and grab the mic.

Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?

John Donvan: Paul Nemitz, in Europe.

Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that?

Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

Paul Nemitz: All search engines.

Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

Paul Nemitz: It’s not Google-specific that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

John Donvan: Sir. Can you tell us your name, please?

Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US, and Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power would be delegated would promulgate some regulations. And the regulations would be a bit more specific.

John Donvan: Good question.

Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

John Donvan: Eric, did you want to respond to that?

Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —

[laughter]

Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten."

Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner.

Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

Jonathan Zittrain: I —

John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.

John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

[laughter]

John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

[talking simultaneously]

John Donvan: Yeah, thanks.

Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are —

[laughter]

John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me?" in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

John Donvan: So, how far are you getting in asking the NSA to delete information?

[laughter]

Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks.

Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

Male Speaker: In terms of the motion — do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal against imposing something like this?

John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —

Male Speaker: Not exactly. I’m saying it’s another avenue of press suppression —

John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

[applause]

Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin.

Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google plus where anybody could create a profile that shows up very high on your results. You can populate that with any data you want to about yours elf. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?Paul Nemitz: Well, Google as a private corp oration is subject to the law as anybody else. And, of course, its decis ions can be checked. So, if you ask that something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.

Male Speaker: But Paul —

Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

Male Speaker: Yeah, Paul, I have to — [unintelligible].

John Donvan: [unintelligible].

Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that.

Eric Posner: There are interests on both sides.

John Donvan: Eric Posner.

Eric Posner: So, that’s the classic role for the law, so that the interests on both sides —

Andrew McLaughlin: How are they vindicated?

Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —

[talking simultaneously]

Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

Eric Posner: No, but they don’t want to set a precedent where they —

Jonathan Zittrain: Google will pay no penalty for over-deleting.

Eric Posner: They will.

Jonathan Zittrain: And no one will even know.

Eric Posner: They will.

Jonathan Zittrain: Who will even know?

Eric Posner: They will — they will know, because it will do worse than an alternative search engine in the world in which we have search engines competing with each other.

John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause]

John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers.

Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.

John Donvan: Thank you, Paul Nemitz.

[applause]

John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

John Donvan: Thank you, Andrew McLaughlin.

[applause]

John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. That perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says — falsely, maybe — that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

John Donvan: Thank you, Eric Posner.

[applause]

John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.

Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

[laughter]

Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know? Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition —

John Donvan: Jonathan Zittrain, I’m sorry —

Jonathan Zittrain: — can I end with one thing —

John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

[laughter]

[applause]

John Donvan: And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]

John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

[talking simultaneously]

Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

[applause]

John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, IQ2US.org, and make a donation there. Our next debate is later this month at the Miller Theatre at Columbia University. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center, to continue the ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April, we’re going to be back here in this theater on the 15th, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates. And I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent, which means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second, a gain of 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan:When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wante d to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a de bate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forg otten’ Online.” As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the wi nner. And only one side ones. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the directo r of this — of this organization, this agency : The Fundamental Rights and Union Citizenship in the Directorate G eneral for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School . John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its " Right to Be Forgotten ” law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So , my question is have more of the critics come over to your side now? Eric Posner:No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: E specially after tonight . Eric Posner: E specially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were , would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I ’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making cle ar you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain.[applause] John Donvan: The eminent Jonath an Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Ber kman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor so lution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken do wn sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ," in this debate , and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this moti on for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So , let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screen s. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes wit hin about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these g reat American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words , the natural words, they don’t tell you the whole story , because actually it is about a deletion right . I t is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why this is important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self -determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about y ou. You must be able to ask “ What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done i n terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are i mportant in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy , how can you organize disse nt? How can you organize a new political party, which maybe wants to [unintelligible] , if already the government always knows everything about you. So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case,“T his doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them . When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person , and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, t he person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auc tioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So , the court said you cannot ask the newspaper to take down t he information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’ t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper of the BBC, of the television station, all this stays around, but Google is subject to the same law of self -control of information, of self-determi nation of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these reque sts in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood t he European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger censors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new censors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded and there areaddress dealers and information dealers which have files o n you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold . For example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children, then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the po ssibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit and Reporting Act about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the federal trade commissioner, Julie Brill just yesterday said, “ Wes, this is what we need in America, a right to obscurity. ” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Dig and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so -called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies be hind the "right to be forgotten" is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten".

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forg et what they would otherwise remember. That ability, that right to remember is one of the most fundamental rights I think of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel , not defamation, not hate speech.

  • 00:15:59

    True information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought which is ultimately what memory is. So, this is the right to force people to forg et true information. Paul said that this is a — the way he framed it was that this is a right to control your ow n life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say . And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European courts’ decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer irrelevant or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well -conne cted elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed , because Google g ives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples : a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And the se are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing, vis -a -vis dictatorships. In a society with steady rule of law, I suppose it’s conceiva ble that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flush it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    S o , as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat , Wuut, W-U -U -T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden andembrace and allow for the self -reinvention that in so many ways define s what it is to be American ?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex -felons for the rest of their lives, not to penalize them by ma king it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future ." G iving any number of individuals a vague standard by which to contro l their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: A nd a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lecturn, Eric Posner. He is the Kirkland and Ellis distinguished service professor of law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17 -year -old boy, he’s arrested for selling drugs. A news item appears in the local pape r. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    S he sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a li ttle bit about what’s going on but they still have a relatively complete image of who she is. A third example, a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, so me of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law p rotects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called, "expungement statutes," that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And a lthough free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about the mselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay . That’s completely changed. Technology has changed. So the law at the time — th e privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance . Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Goo gle, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their paren ts ’ names into Google and out comes these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So , they’ve lost control over this information about th emselves and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with thisbalance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online. " A nd here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law S chool and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is are there problems with privacy ? Y es , there are. And if that were the motion I would hope you would join us in voting for it. If the question were have problems of privacy gotten more difficult over time ? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The US Should Adopt the 'Right to Be Forgotten' Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The US Should Adopt the 'Right to Be Forgotten' Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that, whether fascist dictators or communist dictators, they don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of — information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog. Like, go down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or an individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, which at the insistence of the government it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, saves the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Faulkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents, in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states that allow people’s criminal records to be erased — mean that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. [laughter]

    Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think there are administrative criteria we can set. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten."

    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, for anybody interested in receiving communist literature — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me, in Europe, apply to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying another way of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation — to make the decision, yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to disappear. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. The better Google’s searches are, the more money it makes. This is why — [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz. [applause]

    John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin. [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children; or maybe a statement that you made about your own children, one you wouldn’t want them to hear, is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So when you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner. [applause]

    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the pro folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff comes down." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure we learn not about any specific case, only about the overall criteria. That seems very, very dangerous to me. Now, John said we didn’t owe solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. And we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what was it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause]

    John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to Be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to Be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on The U.S. Should Adopt the "Right to Be Forgotten" Online. In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve most recently written the book "The Twilight of Human Rights Law." And back in the U.S., when Europe adopted its "Right to Be Forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech, and therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house to cover the social contributions more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and FTC Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascistic dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state with some trepidation to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power, or for the individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — the good work that Paul is doing in the EU — to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are no longer relevant, inadequate information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision at the insistence of the government, and it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain:
What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

John Donvan:
Well —

Jonathan Zittrain:
So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.

John Donvan:
Jonathan, that is what's called "hitting a curve ball out of the park."

Jonathan Zittrain:
Thank you.

John Donvan:
Well done. I want to remind you we're in the question and answer section of this Intelligence Squared U.S. debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

Male Speaker:
Hi.

John Donvan:
If you can stand up and grab the mic.

Male Speaker:
Hi. I'm Adam. This law seems to be very specifically tailored to Google. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not?

John Donvan:
Paul Nemitz, in Europe.

Paul Nemitz:
This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

John Donvan:
Andrew, CEO of Digg, how do you feel about that?

Andrew McLaughlin:
I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

Paul Nemitz:
All search engines.

Andrew McLaughlin:
Paul frames this as a very "nothing to see here," incremental extension of past principles. But it is a radical departure from prior precedents, in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.

Paul Nemitz:
It's not Google-specific; it's just that the case was about Google.

  • 01:00:57

But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

John Donvan:
Sir. Can you tell us your name, please?

Male Speaker:
Sure. It's Alameen Sumar [spelled phonetically]. So, my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the U.S.: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

John Donvan:
Good question.

Male Speaker:
And then if, you know, Google wanted to deny a request, or didn't want to deal with the process, you could have some special advocate appointed to —

John Donvan:
Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the U.S., to the point where you would find it acceptable?

Jonathan Zittrain:
It is curious that you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And — I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.

Andrew McLaughlin:
I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.

John Donvan:
Eric, did you want to respond to that?

Eric Posner:
It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about — how strict the criteria would be — and I suspect Jonathan and I are very close together, meaning that he's on our side.

[laughter]

Eric Posner:
For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

So, I do think there are administrative criteria. Now, often it's very difficult to set these up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.

John Donvan:
[unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —

Female Speaker:
[unintelligible] as a follow-up question to yours. In order for your side to win —

John Donvan:
Just so that the radio audience knows who you're talking about — the side arguing for the motion. They're arguing for the "right to be forgotten."

Female Speaker:
Right. So, to me, it's based on an inherent right to privacy. But how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan:
Eric Posner.

Eric Posner:
Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.

John Donvan:
I'm going to — unless you really need to respond to that question — I'm going to move on.

Female Speaker:
For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

John Donvan:
Okay, I think we see where you're going with that. Let's let Jonathan answer that.

Jonathan Zittrain:
I —

John Donvan:
Thank you for the question.

  • 01:06:56

Jonathan Zittrain:
I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and, I think, the Supreme Court now — found, and would find, that unconstitutional, because it's exactly the kind of surveillance of who's wanting to peek under the envelope that all sides appear to be concerned about.

John Donvan:
Another question? Right in the center there. Adamant waving actually does work with me.

[laughter]

John Donvan:
When I see everybody going nuts. Can you tell us your name, please? Again —

[talking simultaneously]

John Donvan:
Yeah, thanks.

Male Speaker:
This question is for the "for" side. I'm a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it'd give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and, if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.

John Donvan:
Okay. So you're asking, if you were smart enough — which it sounds like you probably are —

[laughter]

John Donvan:
— to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

Paul Nemitz:
Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion, and the right to ask what you have about me, applies in Europe to the state and also to corporations. But of course, within reason — national security limitations and so on; you know, there we are not so far apart.

John Donvan:
So, how far are you getting in asking the NSA to delete information?

[laughter]

Paul Nemitz:
Well, the answer is, he described a case where he would, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan:
Sir in the back near the wall. Thanks.

Male Speaker:
Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you'd like to address that.

John Donvan:
Can you rephrase what your question specifically is in terms of the motion here?

Male Speaker:
In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

John Donvan:
In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —

Male Speaker:
Not exactly. I'm saying another way of press suppression —

John Donvan:
Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

Male Speaker:
Tim Havoland [spelled phonetically]. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

[applause]

Jonathan Zittrain:
I wouldn't want Google to be uniquely privileged to answer that question.

John Donvan:
Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner:
I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

John Donvan:
From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They're only looking at what comes up on the first page. Andrew McLaughlin.

Andrew McLaughlin:
Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody's life. And so my view on this is, still, that censorship — which I don't mean to be kind of a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And, by the way, footnote: they've tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.

John Donvan:
Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation — to make the decision, yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?

Paul Nemitz:
Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.

Male Speaker:
But Paul —

Paul Nemitz:
This is an area which is subject to the law. It's not a discretionary decision of Google where, you know, there's the blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent; tell us what you're doing; tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

Male Speaker:
Yeah, Paul, I have to — [unintelligible].

John Donvan:
[unintelligible].

Andrew McLaughlin:
I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.

John Donvan:
But it does — it does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.

Andrew McLaughlin:
They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with that.

  • 01:17:01

They have an interest in that.

Eric Posner:
There are interests on both sides.

John Donvan:
Eric Posner.

Eric Posner:
So, that's the classic role for the law — to balance the interests on both sides —

Andrew McLaughlin:
How are they vindicated?

Eric Posner:
And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. Google makes money from search; the better its searches are, the more money it makes. This is why —

[talking simultaneously]

Jonathan Zittrain:
— talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

Eric Posner:
No, but they don't want to set a precedent where they —

Jonathan Zittrain:
Google will pay no penalty for over-deleting.

Eric Posner:
They will.

Jonathan Zittrain:
And no one will even know.

Eric Posner:
They will.

Jonathan Zittrain:
Who will even know?

Eric Posner:
They will know, because it will do worse than an alternative search engine, in a world in which we have search engines competing with each other.

John Donvan:
And that concludes round two of this Intelligence Squared U.S. debate, where our motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:17:58

[applause]

John Donvan:
Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

Paul Nemitz:
Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the U.S. should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote that the U.S. should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.

John Donvan:
Thank you, Paul Nemitz.

[applause]

John Donvan:
And that's our motion, "The U.S. Should Adopt a 'Right to Be Forgotten' Online." And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

Andrew McLaughlin:
You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That's what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children — we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

John Donvan:
Thank you, Andrew McLaughlin.

[applause]

John Donvan:
And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.

Eric Posner:
This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, and to some extent statutory law and other types of law. And I just want to ask you to imagine, again, but this time that it's you — that perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children; or maybe a statement that you made about your own children, which you wouldn't want them to hear, is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you? Is the "he" me? Or is the "he" Google? Thank you very much.

John Donvan:
Thank you, Eric Posner.

[applause]

John Donvan:
The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

Jonathan Zittrain:
I can't believe that, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

[laughter]

Jonathan Zittrain:
This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the pro folks were saying over the course of the evening kept shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff stays and only the bad stuff goes." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that transparency comes not on specific cases, only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn't have to offer solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it's a search on a name, maybe the first result shouldn't be those 10 terrible, whatever-comes-out-of-the-roulette-wheel links, but a curated page, marked as such — and then many won't go to the second page; at reputation systems; and, of course, competition —

John Donvan:
Jonathan Zittrain, I'm sorry —

Jonathan Zittrain:
Can I end with one thing —

John Donvan:
— your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

[laughter]

[applause]

John Donvan:
And that concludes our closing statements.

  • 01:27:04

And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]

John Donvan:
And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

[talking simultaneously]

Jonathan Zittrain:
Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.

[applause]

John Donvan:
Well, I'm sorry — the clock is my master and has to be your master, and the people voted before hearing that — but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April, we’re going to be back here on the 15th, in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already?

    Male Speaker: It was erased.

    [laughter]

    Male Speaker: Does anybody remember?

    [applause]

    John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to be Forgotten’ Online." In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points.

    [applause]

    John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time.

    [applause]


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz.

    [applause]

    John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC.

    [laughter]

    John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what?

    Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google.

    John Donvan: And that’s what you’re going to be arguing tonight. And tell us, who is your partner in that argument?

    Paul Nemitz: The eminent Professor Eric Posner, from the University of Chicago Law School.

    John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause]

    John Donvan: Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of Human Rights Law." And back in the U.S., when Europe established its "Right to Be Forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now?

    Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough.

    John Donvan: Especially after tonight.

    Eric Posner: Especially after tonight.

    John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

    [applause]

    John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin.

    [applause]

    John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language?

    Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty.

    John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew.

    Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain.

    John Donvan: Ladies and gentlemen, Jonathan Zittrain.

    [applause]

    John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a co-founder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but —

    [laughter]

    Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense, I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes.

    John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz.

    [applause]

    Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he or she has done, or in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is application of existing law, which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a ‘data controller.’" Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals, and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact, in order to make the auctioning of the house to cover the social contributions more attractive for the Spanish state, because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and, second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision, you know, was in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives parents the possibility, and children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin.

    [applause]

    Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," are powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty: the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information: not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that, the way he framed it, this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so, finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause]

    Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay, that’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name into Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause]

    John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy —

    [applause]

    John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain.

    Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift — they’re like looking at the box and it’s ticking — I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, "Bing, how many of these are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can — back to you guys — I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down — I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks — to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person — become a politician — journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here." Like, this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know — an interest in someone who stepped out, him or herself, into the public — the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post — namely, that a concert pianist was asking for a takedown of the critique, the bad critique, of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is, what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes. So that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power, or for the individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them. Because if that’s the choice I have to make — if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill — or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information" — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes — and there’s frequently a time component, you know, five, 10 years ago — and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make, on a daily basis, a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? A) whenever the information is of relevance, of public interest in a democracy, as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the U.S. is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about — they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that, because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story — that won’t be linked under the other guy’s name — push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I'm Adam. This law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.

    Paul Nemitz: It's not Google-specific; it's just that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute and said something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It's not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased, so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side.

    [laughter]

    Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion; they're arguing for the "right to be forgotten."

    Female Speaker: Right. So, to me, it's based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that, as to a reasonable expectation of privacy, you don't have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on.

    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it's exactly the kind of surveillance on who's wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know — in the EU it sounds like it'd give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.

    John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me?" in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir, in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you'd like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I'm saying another way of press suppression —

    John Donvan: Oh, in that case, I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir, in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody's life. And so, you know, my view on this is, still, censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe like in America, to go to court against the decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google — that, you know, the blue sky, and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.

    John Donvan: But it does — it does live on. It's not removed from the internet. It's just you can't find it through the criminal's name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that's the classic rule for the law, so that the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don't want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right of informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote: the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it's justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you. That perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can't believe that, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain: This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the pro folks were saying over the course of the evening kept shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it's not on a specific case, only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn't offer solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won't go to the second page; reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I'm sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.

    [applause]

    John Donvan: Well, I'm sorry — the clock is my master and has to be your master — but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on The U.S. Should Adopt the "Right to be Forgotten" Online: 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion had a first vote of 26 percent and a second vote of 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the courts. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives parents the possibility, and children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have articles about that suppressed in searches on his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside the area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not-so-smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like, yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, "here’s what was deleted today," it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, "Bing, how many of these are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, information "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too, the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make a decision on a daily basis: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down, all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, in the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down on your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but, let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me, in Europe, applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, it’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the pro folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t owe solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. "The U.S. Should Adopt the 'Right to Be Forgotten' Online": on the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent on the first vote to 56 percent on the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent Professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe adopted its "right to be forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said that you actually thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a co-founder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because, you’re saying, something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the words "right to be forgotten," the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice, or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of individuals to decide for themselves what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers or the bank, and can ask a school after a certain time, for things to be deleted, they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints that go on to the courts. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." One last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: the right to be forgotten. We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said, the way he framed it, that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic, a kooky academic like me or Jonathan, came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant," in the view of the complainant, information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America. Yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) Whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the U.S. is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain:
    What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes.

    John Donvan:
    Well —

    Jonathan Zittrain:
    So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan:
    Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain:
    Thank you.

    John Donvan:
    Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker:
    Hi.

    John Donvan:
    If you can stand up and grab the mic.

    Male Speaker:
    Hi. I’m Adam. This — this law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?

    John Donvan:
    Paul Nemitz, in Europe.

    Paul Nemitz:
    This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan:
    Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin:
    I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz:
    All search engines.

    Andrew McLaughlin:
    Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz:
    It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan:
    Sir. Can you tell us your name, please?

    Male Speaker:
    Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: the Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan:
    Good question.

    Male Speaker:
    And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan:
    Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain:
    It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin:
    I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan:
    Eric, did you want to respond to that?

    Eric Posner:
    It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —

    [laughter]

    Eric Posner:
    — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten": only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan:
    [unintelligible] take right down in the front, in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker:
    [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan:
    Just so that the radio audience knows who you’re talking about — the side arguing for the motion. They’re arguing for the "right to be forgotten."

    Female Speaker:
    Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan:
    Eric Posner.

    Eric Posner:
    Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan:
    I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker:
    For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan:
    Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain:
    I —

    John Donvan:
    Thank you for the question.

  • 01:06:56

    Jonathan Zittrain:
    — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan:
    Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan:
    When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan:
    Yeah, thanks.

    Male Speaker:
    This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and, if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan:
    Okay. So you’re asking, if you were smart enough — which sounds like you probably are —

    [laughter]

    John Donvan:
    — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz:
    Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan:
    So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz:
    Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan:
    Sir, in the back near the wall. Thanks.

    Male Speaker:
    Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner, you’d like to address that.

    John Donvan:
    Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker:
    In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is, perhaps, you know, an alarming signal about imposing something like this?

    John Donvan:
    In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker:
    Not exactly. I’m saying another way of press suppression —

    John Donvan:
    Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir, in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker:
    Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain:
    I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan:
    Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner:
    I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan:
    From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin:
    Well, so, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan:
    Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz:
    Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker:
    But Paul —

    Paul Nemitz:
    This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google — and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker:
    Yeah, Paul, I have to — [unintelligible].

    John Donvan:
    [unintelligible].

    Andrew McLaughlin:
    I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan:
    But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

    Andrew McLaughlin:
    They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner:
    There are interests on both sides.

    John Donvan:
    Eric Posner.

    Eric Posner:
    So, that’s the classic role for the law, so that the interests on both sides —

    Andrew McLaughlin:
    How are they vindicated?

    Eric Posner:
    And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. Google makes money — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain:
    — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner:
    No, but they don’t want to set a precedent where they —

    Jonathan Zittrain:
    Google will pay no penalty for over-deleting.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    And no one will even know.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    Who will even know?

    Eric Posner:
    They will — they will know, because it will do worse than an alternative search engine, in a world in which we have search engines competing with each other.

    John Donvan:
    And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan:
    Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz:
    Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote: the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan:
    Thank you, Paul Nemitz.

    [applause]

    John Donvan:
    And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin:
    You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan:
    Thank you, Andrew McLaughlin.

    [applause]

    John Donvan:
    And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner:
    This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you. This is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan:
    Thank you, Eric Posner.

    [applause]

    John Donvan:
    The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain:
    I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain:
    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web — let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening was shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan:
    Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain:
    — can I end with one thing —

    John Donvan:
    — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan:
    And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain:
    Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan:
    Well, I’m sorry, the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates. And I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was in response to some sort of mistake I made. Oh, it came up already?

    Male Speaker: It was erased.

    [laughter]

    Male Speaker: Does anybody remember?

    [applause]

    John Donvan: All right. Well, for the sake of people who are listening —

    [laughter]

    Play along, please.

    [laughter]

    Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points.

    [applause]

    John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time.

    [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz.

    [applause]

    John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC.

    [laughter]

    John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what?

    Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google.

    John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument.

    Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School.

    John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause]

    Yes, Eric, you are a law professor. You’ve most recently written the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "right to be forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now?

    Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough.

    John Donvan: Especially after tonight.

    Eric Posner: Especially after tonight.

    John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

    [applause]

    John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin.

    [applause]

    John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language?

    Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty.

    John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew.

    Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain.

    John Donvan: Ladies and gentlemen, Jonathan Zittrain.

    [applause]

    John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but —

    [laughter]

    Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes.

    John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate — and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz.

    [applause]

    Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion — of course, within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society — because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this doesn’t just fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information, about this person. And our law requires that, in the same way that our people can ask data brokers, the bank, or the school to delete things after a certain time, they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state — because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish — but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-determination of individuals over their information, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that they are now dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, very contested in the beginning; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives parents the possibility — and children, when they become 18 — to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, you can ask for credit information which is older than seven years to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America: a right to obscurity." Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin.

    [applause]

    Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," are powerful. And they are understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. And there are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: the right to be forgotten. We call it a right, but really, it’s a duty — and Paul owned up to this very directly in his opening statement. The "right to be forgotten" is the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. It works by suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information: not libel, not defamation, not hate speech.

  • 00:15:59

    True information. The "right to be forgotten" intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches on his name. And these are from, like, 2010, 2011. This is not ancient history. One other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn, perhaps, not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note: as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause]

    Eric Posner: I’d like to ask you to cast your minds back 25, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, misses some credit card bills, and files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her — what she’s like, how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not-so-smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important — or he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s would have gossiped, when today they would write about it on Facebook. Maybe she would have written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "right to be forgotten" online in the United States. Thank you very much.

  • 00:27:57

    [applause]

    John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy —

    [applause]

    John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain.

    Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is "Are there problems with privacy?" — yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were "Have problems of privacy gotten more difficult over time?" — we’d vote for that motion, too; I think I’m comfortable saying Andrew would vote for it. If the motion were "Should everything be recorded at all times, and if so, should we do something about that?" — I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish premier league referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific just because the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.
    [laughter]
    Eric Posner: You know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "Right to Be Forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said — I’m not making this up — "For anybody interested in receiving communist literature, you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough, which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.
    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?
    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no — we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America against a decision of the FTC, to go to court.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s only blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money from search — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. That perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening keeps shifting. It’s a standard of "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff comes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure the public knows not about specific cases, only about the overall criteria, and that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion took 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent on the first vote to 56 percent on the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve most recently written the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already in the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have that privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And with that, I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, more than any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, or the bank, or the school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and there was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." One last remark: You already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, you can ask for credit information which is older than seven years to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear it generates is that you will forever be defined by that one mistake you made that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. And there are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty: the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here: the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles outlining all of the stories that have been suppressed; the ones that I found were in the English-language press in the UK. Google gives notification to the publisher when an article has been suppressed, and so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have articles about it suppressed in searches on his name. And these are from 2010, 2011. These are not ancient history. One other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this interest. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 or 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy: he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not. But they know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, misses some credit card bills, and files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her, what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they were private. Not everybody in the world, or even outside the immediate area, heard about these events. And the law protected them. There were privacy laws — they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves: not necessarily political information, just personal information, things that are embarrassing. Now, I want you to imagine that back then, in 1990, an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not-so-smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, let’s go through these three people again. This kid, 10 or 15 years later — let’s suppose all of this happened in the last few years — applies for a job, the employer puts his name into Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? As for the single mother: in the ’90s her friends would have gossiped, where today they might write about it on Facebook. Maybe she would have written about it on Facebook or on a blog herself. Now a potential romantic partner, employer, neighbor, or colleague — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, their children could type their parents’ names into Google and out come these allegations. And not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. The neighbors, the people in the neighborhood, know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. If the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too — I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — where it’s more chilling for some entity to have that power or the individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "For" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me, in Europe, applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the "Yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal against imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. Google makes money from search: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. What he describes is the empowerment of an individual on one side of a conversation — which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten": the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or, for that matter, if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment: first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information, about this person. And our law requires that, in the same way that our people can ask data brokers, the bank, the school, after a certain time, for things to be deleted, they should be able to ask Google for deletion. Now, the person in question, who brought this case from Spain, had not paid some social contributions, and therefore his house had been confiscated. And it was a legal obligation in Spain to have a publication in a newspaper of this fact, in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the federal trade commissioner, Julie Brill, just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty, the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. And let’s be very clear about what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn, perhaps, not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy who’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves: not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay, that’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too; I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

The question is about a right to be forgotten, and I want to use my opening statement to explain that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

[laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is, the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response to their calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

But that’s a false dichotomy. I think there is plenty. Paul is doing good work in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable: that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make a decision on a daily basis: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made the decision himself or herself to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration. There’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This — this law seems to be tailored very specifically to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute that said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power would be delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —

    [laughter]

    Eric Posner: — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion; they’re arguing for the "right to be forgotten."

    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question — I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU, it sounds like, give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying another way of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still: censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And, by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, that’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money the better its searches are; the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right of informational self-determination. And it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. Because that is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths to deal with embarrassing mistakes that come out on the internet: more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you: perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the "for" folks were saying over the course of the evening is shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff comes down." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it’s not on a specific case, only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn’t owe solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page, marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and in the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the team arguing for the motion came in at 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say — and children when they become 18 — to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat, Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times and, if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that's a very bad solution to a very real problem. So, let's take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That's not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don't think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It's like time to make the donuts, you're like yes, no, no, and Paul's like, you know, they've really learned.

  • 00:30:06

    They've gotten good at this, and that's because if they don't grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there'd be something similar here. That's bad. That's more process. If Google grants the request, that's it. Intelligence Squared wouldn't be informed that their page is no longer findable through the world's most powerful information resource. There'd be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here's what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we're actually talking about, which is why, oddly enough, as much as Google hates this gift, they're like looking at the box and it's ticking. I'm hoping they don't open it. I'm hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn't know what comes up in Google search results. It's a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn't they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that's what's so strange. If there's going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it's adjudicating doesn't spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won't just be AI on Google's side taking it out, it'll be AI on the complainant's side. I'll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it'll automatically file a request on my behalf, and I don't even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it's today's ecosystem. And basically, it's just Google.

  • 00:33:00

    I'm like, "Bing, how many are you getting?" They're like, "We can't tell you." It's like, "You can't tell us because there are so many, or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it's a nonprofit? In fact, what if it's a Wiki page of Wikipedia that's taken down, and the page gets changed so the objectionable material is gone, it's been edited out, it's been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the "for" side is rightfully bringing up.

  • 00:34:02

    We've got to work on this stuff. Maybe court records shouldn't be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant" information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared U.S. debate, where our motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Keep in mind, please, how you voted at the top of the evening. Again, we're going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person's otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what's being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that's arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we're talking about, where an individual can go to Google and say, "When my name comes up, I don't want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn't want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don't want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don't feel you're answering, which was how you respond to their calling this censorship. Paul Nemitz: It's not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It's an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That's censorship. John Donvan: When you say — just to be clear, you're not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones reported by the Daily Mail. They took out a link to a 2002 story that reported, somewhat amusingly, on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That's correct. John Donvan: I've got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren't lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we're talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that's a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let's say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that's not censorship. That's a court action — Eric Posner: The original — John Donvan: Let's let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there's also a strong interest in privacy. John Donvan: We'll take Jonathan Zittrain. Jonathan Zittrain: I wouldn't get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What's happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it's very different from the examples you're coming up with. The fact, too, that it's only coming out of Google's index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it's not so bad that it needs to come down. It's like saying the book can stay in the library. We just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the '90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there's a point there. The point they're making is it's not censorship of the source material. It's still there.

  • 00:41:58

    You can go find it if you dig enough, that it's actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn't make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that's the way things work. That's the way we secure our homes, for example. We don't make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won't deter people who are extremely interested in getting into our house. That's — so you raise the cost without making something impossible. That's a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what's — you know, what's interesting about this to me is I regularly hear EU officials, you know, say, "Look, there's just nothing much to see here," like this is — only if you search for the person's name directly will that search result not appear.

  • 00:43:04

    It's just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There's no way that it's just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they're pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that's not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn't apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn't even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you're not thinking of today. You're not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don't want to go out in public and say, "I'm a Democrat," "I'm a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That's the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that's happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn't even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you've phrased the question — [applause] John Donvan: — twice, and it's great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there's a pipeline of that onto the open web that Google then indexes. So that's factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I'm saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn't stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he's asking a serious question here. Jonathan Zittrain: Just the fact that he's wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that's true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He's talking about which is the more chilling effect: knowing that you're out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it's more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them. Because if that's the choice I have to make, if it's really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I'll probably take this without even looking inside the box.

  • 00:47:07

    But that's a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that's what you're wanting to do, Facebook, you'd better talk with us, because that's personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I'm, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven't heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that's what I object to. John Donvan: Let's take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I'm not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public's interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they're stated are vague, but when you actually read the cases, it becomes pretty clear what's going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don't win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there's frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It's just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That's what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let's — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it's true that if you only read this one judgment, and that's normal in the law, you don't understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I've heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down, all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it's the ordinary guy who's getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it's only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it's part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we're getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it's not like the U.S. is any better, by the way. It's just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul's assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I'm just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It's now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy's name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It's not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we've got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don't agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you're talking about, they don't have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They're the most vulnerable people. The — John Donvan: But, actually, to address Andrew's point, he's saying put a form on Google where it pops up with your search results are now on — here's — here's my response to what's being [unintelligible]. Eric Posner: Yeah, but Google hasn't done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don't think that's the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it's — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I'd like to start going to those. Ma'am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we'll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone's right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver's name. Female Speaker: Right. But still, I mean, that's what I'm asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it's okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you're saying that you would want the driver's name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other's [unintelligible]? John Donvan: Let's take that to Paul, because you've actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won't be linked under the other guy's name, push back and want that link to show up under the other guy's name? Paul Nemitz: Haven't heard about it. John Donvan: Okay. That's an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I'm a past debater. Great debate. Thank you very much. I'm surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I'm going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don't think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it's relevant in the sense that if somehow that were awkward, and I wouldn't — even if I were feeling, geez, I wish that were not now the third hit on Google, I've got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It's already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I'm Adam. This law seems to be very specifically tailored to Google. And I'm wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents, in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.

    Paul Nemitz: It's not Google-specific — it's just that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together — meaning that he's on our side.

    [laughter]

    Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria. Now, often it's very difficult to set them up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion. They're arguing for the "right to be forgotten."

    Female Speaker: Right. So, to me, it's based on an inherent right to privacy. But how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on.

    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past, there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it's exactly the kind of surveillance of who's wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.

    John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I'm saying another way of press suppression —

    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, in the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody's life. And so, you know, my view on this is, still, censorship — which I don't mean to be kind of a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google — that, you know, the blue sky, and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.

    John Donvan: But it does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that's the classic role for the law — balancing the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: the profit motive. Google makes money — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don't want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation — which is what speech on the internet is; it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you. Perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can't believe that, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain: This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that happens not on a specific case but only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn't offer solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding that, when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page, marked as such, and then many won't go to the second page; at reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I'm sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. We'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about: you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.

    [applause]

    John Donvan: Well, I'm sorry — the clock is my master and has to be your master, and the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and the ticket sales don't come close to covering the cost of mounting a great debate like this. So, I'd encourage you, if you would be willing to make a donation, to go to our website, that's IQ2US.org, and make a donation there. Our next debate is later this month. It's at Columbia University at the Miller Theatre. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center, and we're going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we're going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We've heard the arguments — sorry? Did I — sorry? There's chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We'll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I've done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let's have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared U.S. I'm John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let's meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that's what you're going to be arguing tonight. And tell us, who is your partner in that argument? Paul Nemitz: The eminent Professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You've written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "right to be forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven't, but I'm hoping they'll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let's welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You're no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I'd probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There's really no place in the world that's been so affected by it. But I do think it's a travesty. John Donvan: Okay, we're making clear you're not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you're a professor of law and computer science at Harvard. You're a co-founder of the Berkman Center for Internet & Society. You looked at the EU's — Europe's "right to be forgotten," and you said that there is a certain elegance to the idea because it — you're saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I've described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We're very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you'll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let's register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you're for the motion, number two if you're against, and number three if you're undecided. And we'll take about 15 seconds to complete that process, and then we'll lock out the voting devices. All right, folks, let's lock out those votes. And, again, we'll have the second vote right after you've heard the closing arguments. And then we'll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I'm not here on an official mission from the European Commission to turn America to this notion of law. It's more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten": the words, the natural words, they don't tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the '70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can't ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that's what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don't have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn't fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn't apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask a school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it's in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn't mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the courts. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what's happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I'm going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn't even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that's the outline of the argument that I'm going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it's a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let's be very clear, that is what we're talking about here. We're talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we're talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said, the way he framed it was, that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it's way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court's decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It's very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul's claim that it's working kind of well, you can Google any number of articles (the ones that I found were in the English-language press in the UK) outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you'll find the case of the piano player who didn't like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches on his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it's conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that's not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we've seen that the internet adapts. We've seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I'll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what's going on: we are halfway through the opening round of this Intelligence Squared U.S. debate. I'm John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You've heard two of the opening arguments, and now on to the third. Let's welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I'd like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I'll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he's arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That's not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It's not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she's able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she's like and how she is a good mother. So, they know a little bit about what's going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other's infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they were private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let's say it could be an employer or a creditor or a future romantic partner — it's really in the stranger's interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn't convicted, it's still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not-so-smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people's privacy." But, of course, this is what's happened. It's happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the '80s and '90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That's completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people's privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10 or 15 years later — let's suppose all of this happened in the last few years — he applies for a job, the employer puts his name into Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother's friends, you know, in the '90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would've written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That's how she's going to be defined to the public. And in the case of the divorced couple, in the old days the divorce would be public. It's public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents' names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they've lost control over this information about themselves, and it's not just that they've lost this control; they've lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the "right to be forgotten" would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We'd vote for that motion, too. I think I'm comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as it did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is, what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or the individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is that through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish premier league referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

John Donvan: Well —

Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.

John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."

Jonathan Zittrain: Thank you.

John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The US should adopt the 'right to be forgotten' online." Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

Male Speaker: Hi.

John Donvan: If you can stand up and grab the mic.

Male Speaker: Hi. I'm Adam. This — this law seems to be very specifically tailored to Google. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

John Donvan: Paul Nemitz, in Europe.

Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that?

Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

Paul Nemitz: All search engines.

Andrew McLaughlin: Paul frames this as just, like, a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.

Paul Nemitz: It's not Google specific — the case was about Google.

  • 01:00:57

But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

John Donvan: Sir. Can you tell us your name, please?

Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

John Donvan: Good question.

Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —

John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.

Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.

John Donvan: Eric, did you want to respond to that?

Eric Posner: It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states that allow people's criminal records to be erased — mean that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side. For the —

[laughter]

Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set these up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.

John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —

Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion, they're arguing for the "right to be forgotten."

Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search and seizure context, which says that you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner.

Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.

John Donvan: I'm going to — unless you really need to respond to that question — I'm going to move on.

Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity: you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —

John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.

Jonathan Zittrain: I —

John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it's exactly the kind of surveillance on who's wanting to peek under the envelope that all sides appear to be concerned about.

John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

[laughter]

John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

[talking simultaneously]

John Donvan: Yeah, thanks.

Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it'd give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.

John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are —

[laughter]

John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

John Donvan: So, how far are you getting in asking the NSA to delete information?

[laughter]

Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir, in the back near the wall. Thanks.

Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?

John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —

Male Speaker: Not exactly. I'm saying another way of press suppression —

John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir, in the back in the red sweater, the very far back. Thanks for your question, though.

Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

[applause]

Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.

John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They're only looking at what comes up on the first page. Andrew McLaughlin.

Andrew McLaughlin: Well, so, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody's life. And so, you know, my view on this is, still, that censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

They created something called Google Plus, where anybody can create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?

Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, as in Europe and in America against a decision of the FTC, to go to court.

Male Speaker: But, Paul —

Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google where, you know, there's the blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

Male Speaker: Yeah, Paul, I have to — [unintelligible].

John Donvan: [unintelligible].

Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: all of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.

John Donvan: But it does — it does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.

Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee's name to be associated with that.

  • 01:17:01

They have an interest in that.

Eric Posner: There are interests on both sides.

John Donvan: Eric Posner.

Eric Posner: So, that's the classic role for the law, to balance the interests on both sides —

Andrew McLaughlin: How are they vindicated?

Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. The better Google's searches are, the more money it makes. This is why —

[talking simultaneously]

Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

Eric Posner: No, but they don't want to set a precedent where they —

Jonathan Zittrain: Google will pay no penalty for over-deleting.

Eric Posner: They will.

Jonathan Zittrain: And no one will even know.

Eric Posner: They will.

Jonathan Zittrain: Who will even know?

Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is "The US should adopt the 'right to be forgotten' online."

  • 01:17:58

[applause]

John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.

John Donvan: Thank you, Paul Nemitz.

[applause]

John Donvan: And that's our motion, "The US should adopt a 'right to be forgotten' online." And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden of showing that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

John Donvan: Thank you, Andrew McLaughlin.

[applause]

John Donvan: And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.

Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you. Perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

John Donvan: Thank you, Eric Posner.

[applause]

John Donvan: The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.

Jonathan Zittrain: I can't believe that in the last slot I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

[laughter]

Jonathan Zittrain: This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It's a standard of "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it's not on a specific case, only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn't owe solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won't go to the second page; reputation systems; and, of course, competition —

John Donvan: Jonathan Zittrain, I'm sorry —

Jonathan Zittrain: — can I end with one thing —

John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

[laughter]

[applause]

John Donvan: And that concludes our closing statements.

  • 01:27:04

And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. We'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about: you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]

John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

[talking simultaneously]

Jonathan Zittrain: Here was the landing I was hoping to make: we all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.

[applause]

John Donvan: Well, I'm sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and in the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the team arguing for the motion came in at 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why this is important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you. So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station, all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember is one of the most fundamental rights I think of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing, vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat, Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis distinguished service professor of law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy, he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on but they still have a relatively complete image of who she is. A third example, a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is are there problems with privacy? Yes, there are. And if that were the motion I would hope you would join us in voting for it. If the question were have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog — like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post — namely that a concert pianist was asking for a takedown of the critique, the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow, Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make — if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff — before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook — you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration: there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This — this law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe?

    Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just, like, a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —

    [laughter]

    Eric Posner: — you know, for example, I would completely rule out public figures from using the "right to be forgotten": only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in the front, in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion; they’re arguing for the "right to be forgotten."

    Female Speaker: Right. So, to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question — I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found, and would find, that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask, what do you have about me, applies in Europe to the state and also to corporations — but of course, within reasonable national security limitations and so on. You know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe — than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying another way of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still: censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against that decision, as it could against a decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think this is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are toward deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with it.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money; the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. Speech on the internet is a conversation: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children — we censor a lot of different kinds of information. But each is justified only by overcoming a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor: this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays in and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure transparency comes not on any specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page, marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan: Well, I’m sorry the clock is my master and has to be your master — the people voted before hearing that — but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richmond Center for Business Law and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and in the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and the Federal Trade Commissioner, Julie Brill, just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must never say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger, if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or for the individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, which at the insistence of the government it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And, Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what's called "hitting a curveball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I'm Adam. This law seems to be very specifically tailored to Google. And I'm wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents, in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.
    Paul Nemitz: It's not Google-specific; it's that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations, and the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to deny a request, or didn't want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And — I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I are very close together, meaning that he's on our side. For the — [laughter]
    Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria. Now, often it's very difficult to set these up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works, and I expect that's how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion; they're arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yes, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I'm going to — unless you really need to respond to that question — I'm going to move on.
    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it's exactly the kind of surveillance of who's wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.
    John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are — [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe apply to the state and also to corporations — but of course, within reason: national security limitations and so on. You know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]
    Paul Nemitz: Well, the answer is: he described a case where he would, as a developer, for his private purposes, design his own web engine. This constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you'd like to address that.
    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is, perhaps, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I'm saying another way of press suppression —
    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]
    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled is that people develop over time. The one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply, and I think a lot about it. The internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody's life. And so my view on this is, still, censorship — which I don't mean to be kind of a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly — and then a judge will decide — or you can go to an independent data protection authority, which is something like an independent agency, like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google — you know, the blue sky, and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing, tell us what the criteria are. That is good, and we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.
    John Donvan: But it does — it does live on. It does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that's the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: the profit motive. Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don't want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands — even in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz. [applause]
    John Donvan: And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation. That is what speech on the internet is — people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified; it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden of showing that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin. [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you. Perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner. [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can't believe that, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We'll strike that later. We've heard a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff comes down." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like: when Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know? Paul has been working, I think, to make sure the answer is no — not on a specific case, only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn't offer solutions, and in 30 seconds it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — possibly even somebody like Google deciding that when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won't go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I'm sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so. [applause]
    John Donvan: Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": in the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the directo r of this — of this organization, this agency : The Fundamental Rights and Union Citizenship in the Directorate G eneral for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School . John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its " Right to Be Forgotten ” law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So , my question is have more of the critics come over to your side now? Eric Posner:No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: E specially after tonight . Eric Posner: E specially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were , would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I ’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making cle ar you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain.[applause] John Donvan: The eminent Jonath an Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Ber kman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor so lution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken do wn sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ," in this debate , and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this moti on for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So , let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screen s. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes wit hin about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these g reat American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words , the natural words, they don’t tell you the whole story , because actually it is about a deletion right . I t is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why this is important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self -determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about y ou. You must be able to ask “ What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done i n terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are i mportant in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy , how can you organize disse nt? How can you organize a new political party, which maybe wants to [unintelligible] , if already the government always knows everything about you. So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case,“T his doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them . When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person , and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, t he person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auc tioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So , the court said you cannot ask the newspaper to take down t he information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’ t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper of the BBC, of the television station, all this stays around, but Google is subject to the same law of self -control of information, of self-determi nation of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these reque sts in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood t he European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger censors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new censors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded and there areaddress dealers and information dealers which have files o n you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold . For example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children, then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the po ssibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit and Reporting Act about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the federal trade commissioner, Julie Brill just yesterday said, “ Wes, this is what we need in America, a right to obscurity. ” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Dig and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so -called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies be hind the "right to be forgotten" is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten".

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forg et what they would otherwise remember. That ability, that right to remember is one of the most fundamental rights I think of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel , not defamation, not hate speech.

  • 00:15:59

    True information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought which is ultimately what memory is. So, this is the right to force people to forg et true information. Paul said that this is a — the way he framed it was that this is a right to control your ow n life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say . And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European courts’ decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer irrelevant or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well -conne cted elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed , because Google g ives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples : a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And the se are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing, vis -a -vis dictatorships. In a society with steady rule of law, I suppose it’s conceiva ble that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flush it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    S o , as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat , Wuut, W-U -U -T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden andembrace and allow for the self -reinvention that in so many ways define s what it is to be American ?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex -felons for the rest of their lives, not to penalize them by ma king it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future ." G iving any number of individuals a vague standard by which to contro l their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: A nd a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lecturn, Eric Posner. He is the Kirkland and Ellis distinguished service professor of law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17 -year -old boy, he’s arrested for selling drugs. A news item appears in the local pape r. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    S he sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a li ttle bit about what’s going on but they still have a relatively complete image of who she is. A third example, a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, so me of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law p rotects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called, "expungement statutes," that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And a lthough free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about the mselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay . That’s completely changed. Technology has changed. So the law at the time — th e privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance . Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Goo gle, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their paren ts ’ names into Google and out comes these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So , they’ve lost control over this information about th emselves and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with thisbalance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online. " A nd here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law S chool and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is are there problems with privacy ? Y es , there are. And if that were the motion I would hope you would join us in voting for it. If the question were have problems of privacy gotten more difficult over time ? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The US Should Adopt the 'Right to Be Forgotten' Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I’ll give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision at the insistence of the government, and it must keep it secret, the cases we hear about are happenstance. How does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify it as somebody from whom I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious that you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that as to a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU, it sounds like, give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me, in Europe, applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back, near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal — but as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe — than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, where, you know, there’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money the better its searches are; the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff comes down." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure disclosure happens not on any specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business Law and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and the first vote: 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result was, for the team arguing for the motion, their second vote was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he has done or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to ask, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat, Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed. She has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of like EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results: here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curveball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power would be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the view that, in the Fourth Amendment search and seizure context, you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook — how do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as delete — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that’s not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t owe solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the 'Right to Be Forgotten' Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this — limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to ask, and for children when they become 18, to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, whereas today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control — they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to eliminate the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is that through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Faulkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that it were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain:
    What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan:
    Well —

    Jonathan Zittrain:
    So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.

    John Donvan:
    Jonathan, that is what's called "hitting a curveball out of the park."

    Jonathan Zittrain:
    Thank you.

    John Donvan:
    Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker:
    Hi.

    John Donvan:
    If you can stand up and grab the mic.

    Male Speaker:
    Hi. I'm Adam. This law seems to be very specifically tailored to Google. And I'm wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan:
    Paul Nemitz, in Europe.

    Paul Nemitz:
    This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan:
    Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin:
    I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz:
    All search engines.

    Andrew McLaughlin:
    Paul frames this as a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents, in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.

    Paul Nemitz:
    It's not Google specific — it's that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan:
    Sir. Can you tell us your name, please?

    Male Speaker:
    Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something is irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan:
    Good question.

    Male Speaker:
    And then if, you know, Google wanted to deny a request, or didn't want to deal with the process, you could have some special advocate appointed to —

    John Donvan:
    Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain:
    It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin:
    I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.

    John Donvan:
    Eric, did you want to respond to that?

    Eric Posner:
    It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states that allow people's criminal records to be erased — mean that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I are very close together, meaning that he's on our side.

    [laughter]

    Eric Posner:
    For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think there are administrative criteria we could set. Now, often it's very difficult to set them up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.

    John Donvan:
    [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —

    Female Speaker:
    [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan:
    Just so that the radio audience knows who you're talking about — the side arguing for the motion. They're arguing for the "right to be forgotten."

    Female Speaker:
    Right. So to me it's based on an inherent right to privacy, but how do you square that with the U.S. view — going to your constitutional point — that, in the Fourth Amendment search and seizure context, you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan:
    Eric Posner.

    Eric Posner:
    Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan:
    I'm going to — unless you really need to respond to that question — I'm going to move on.

    Female Speaker:
    For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan:
    Okay, I think we see where you're going with that. Let's — let Jonathan answer that.

    Jonathan Zittrain:
    I —

    John Donvan:
    Thank you for the question.

  • 01:06:56

    Jonathan Zittrain:
    — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it's exactly the kind of surveillance of who's wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan:
    Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan:
    When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan:
    Yeah, thanks.

    Male Speaker:
    This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.

    John Donvan:
    Okay. So you're asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan:
    — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz:
    Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe apply to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan:
    So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz:
    Well, the answer is: he described a case where he would, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan:
    Sir in the back near the wall. Thanks.

    Male Speaker:
    Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

    John Donvan:
    Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker:
    In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps an alarming signal about imposing something like this?

    John Donvan:
    In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker:
    Not exactly. I'm saying another way of press suppression —

    John Donvan:
    Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker:
    Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain:
    I wouldn't want Google to be uniquely privileged to answer that question.

    John Donvan:
    Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner:
    I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled is that people develop over time. The thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan:
    From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin:
    Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody's life. And so my view on this is, still, censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.

    John Donvan:
    Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz:
    Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker:
    But Paul —

    Paul Nemitz:
    This is an area which is subject to the law. It's not a discretionary decision of Google, where, you know, there's the blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker:
    Yeah, Paul, I have to — [unintelligible].

    John Donvan:
    [unintelligible].

    Andrew McLaughlin:
    I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.

    John Donvan:
    But it does live on. It does live on. It's not removed from the internet. It's just you can't find it through the criminal's name.

    Andrew McLaughlin:
    They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that referee's name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner:
    There are interests on both sides.

    John Donvan:
    Eric Posner.

    Eric Posner:
    So, that's the classic role for the law, to balance the interests on both sides —

    Andrew McLaughlin:
    How are they vindicated?

    Eric Posner:
    And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain:
    — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner:
    No, but they don't want to set a precedent where they —

    Jonathan Zittrain:
    Google will pay no penalty for over-deleting.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    And no one will even know.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    Who will even know?

    Eric Posner:
    They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan:
    And that concludes round two of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:17:58

    [applause]

    John Donvan:
    Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission's Directorate-General for Justice and Consumers.

    Paul Nemitz:
    Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan:
    Thank you, Paul Nemitz.

    [applause]

    John Donvan:
    And that's our motion, "The U.S. Should Adopt a 'Right to Be Forgotten' Online." And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin:
    You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That is what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children — we censor a lot of different kinds of information. But in those cases it's justified, and it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met the burden of showing that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan:
    Thank you, Andrew McLaughlin.

    [applause]

    John Donvan:
    And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.

    Eric Posner:
    This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you. Perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan:
    Thank you, Eric Posner.

    [applause]

    John Donvan:
    The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain:
    I can't believe that in the last slot I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain:
    This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web — let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening keeps shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn't offer solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won't go to the second page; reputation systems; and, of course, competition —

    John Donvan:
    Jonathan Zittrain, I'm sorry —

    Jonathan Zittrain:
    — can I end with one thing —

    John Donvan:
    — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    John Donvan:
    And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan:
    And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain:
    Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.

    [applause]

    John Donvan:
    Well, I'm sorry — the clock is my master and has to be your master — but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment: first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and there was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    This intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want anyone to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger, if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are no longer relevant, inadequate information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision at the insistence of the government, and it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you for that, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of like EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish premier league referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Do they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way in your view to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that is now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over. I’d really like you guys to give it its fullest consideration, there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google specific that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper in your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that makes the case more compelling? So, for example, you know, imagine if this happened in the US, the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power will be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it’s based on an inherent right to privacy, but how do you square that with at least the U.S., going to your constitutional point, that says, you know, with the Fourth Amendment search and seizure context, that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook, how do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said but I suspect will be revised over time. These constitutional norms are not fixed, they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if in the case of the Telegraph they had a list of suppressed articles but to access those suppressed articles you had to log in, in some fashion or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as delete — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation. I know the law in Europe doesn’t address it, but let’s say does it become relevant in this conversation. Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on, you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer for his private purposes design his own web engine, so this constellation would mean that you are not a controller, you would not be treated like Google and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side, Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal, but as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people, so perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal to imposing something like this? John Donvan: In a sense are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete, is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus where anybody could create a profile that shows up very high on your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law as anybody else. And, of course, its decisions can be checked. So, if you ask that something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic rule for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money by searching well — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU commission, directorate general for justice and consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hand, also in the times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again but this time that it’s you, that perhaps you’re going through hard times in your marriage if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now, this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff stays out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records like legal documents and court materials, options for contextualization to include possibly even somebody like Google deciding, and when it’s a search on a name maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such and then many won’t go to the second page, reputation systems and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this was a little bit of a tough debate in terms of it was relatively nuanced and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about, you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and the ticket sales don't come close to covering the cost of mounting a great debate like this. So, I'd encourage you, if you would be willing to make a donation, to go to our website, that's IQ2US.org, and make a donation there. Our next debate is later this month. It's at Columbia University at the Miller Theater. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We're going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we're going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We've heard the arguments — sorry? Did I — sorry? There's chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online." In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion had 26 percent on the first vote and 56 percent on the second; they went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We'll see you next time. [applause]
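The scoring rule Donvan describes (the side whose share of the audience moves the most between the two votes wins) can be sketched in a few lines. This is a minimal illustration, not part of the debate itself: the "for" and "against" percentages are the ones read out in the transcript, while the second-vote "undecided" figure is inferred so the shares sum to 100 and is an assumption.

```python
# Sketch of the Intelligence Squared scoring rule: the winner is the
# side whose audience share grows the most between the two votes.

def winner(first, second):
    """Return (winning side, per-side change in percentage points)."""
    deltas = {side: second[side] - first[side] for side in first}
    return max(deltas, key=deltas.get), deltas

# Percentages read out in the transcript; second-vote "undecided" (9)
# is inferred from the other two figures and is an assumption.
first_vote = {"for": 36, "against": 26, "undecided": 38}
second_vote = {"for": 35, "against": 56, "undecided": 9}

side, deltas = winner(first_vote, second_vote)
# "against" gains 56 - 26 = 30 points while "for" loses 1, so "against" wins.
```

The arithmetic also confirms the result announced on stage: a 30-point swing for the side arguing against the motion.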


  • 00:01:04

Let's meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that's what you're going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You've written, most recently, the book "The Twilight of Human Rights Law." And back in the U.S., when Europe established its "Right to Be Forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven't, but I'm hoping they'll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let's welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You're no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I'd probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There's really no place in the world that's been so affected by it. But I do think it's a travesty. John Donvan: Okay, we're making clear you're not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you're a professor of law and computer science at Harvard. You're a cofounder of the Berkman Center for Internet & Society. You looked at the EU's — Europe's "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you're saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I've described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We're very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you'll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let's register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

Push number one if you're for the motion, number two if you're against, and number three if you're undecided. And we'll take about 15 seconds to complete that process, and then we'll lock out the voting devices. All right, folks, let's lock out those votes. And, again, we'll have the second vote right after you've heard the closing arguments. And then we'll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I'm not here on an official mission from the European Commission to turn America to this notion of law. It's more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten": the words, the natural words, they don't tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the '70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can't ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that's what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don't have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn't fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn't apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank (you know, they can ask the school after a certain time for things to be deleted), they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it's in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn't mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what's happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I'm going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn't even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that's the outline of the argument that I'm going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it's a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty: the power to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let's be very clear, that is what we're talking about here. We're talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we're talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it's way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

The language in the European court's decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It's very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul's claim that it's working kind of well, you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you'll find the case of the piano player who didn't like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it's conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that's not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we've seen that the internet adapts. We've seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I'll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what's going on: we are halfway through this opening round of the Intelligence Squared US debate. I'm John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You've heard two of the opening arguments, and now on to the third. Let's welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I'd like to ask you to cast your minds back 25 years, 30 years. Those of you who are old enough will remember; for some of the younger people here, I'll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He's arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That's not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It's not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she's able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she's like and how she is a good mother. So, they know a little bit about what's going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other's infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they're private. Not everybody in the world, or even outside the area, hears about these events. And the law protected them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let's say it could be an employer or a creditor or a future romantic partner — it's really in the stranger's interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn't convicted, it's still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart a guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people's privacy." But, of course, this is what's happened. It's happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the '80s and '90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That's completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people's privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let's suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother's friends, you know, in the '90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would've written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That's how she's going to be defined to the public. And in the case of the divorced couple, in the old days the divorce would be public. It's public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents' names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they've lost control over this information about themselves. And it's not just that they've lost this control; they've lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We'd vote for that motion, too. I think I'm comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are no longer relevant, inadequate information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision at the insistence of the government, and it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and I produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose it as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just a very "nothing to see here," incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific just because the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious that you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws, statutes in most states that allow people’s criminal records to be erased, mean that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.

    [laughter]

    Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in front in the third row, please. And the mic’s coming down on your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "right to be forgotten."

    Female Speaker: Right. So, to me, it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question — I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found, and would find, that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and, if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me?" in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir, in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.

    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?

    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is, perhaps, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying it’s another avenue of press suppression —

    John Donvan: Oh, in that case, I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir, in the back, in the red sweater — the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody can create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation — to make the decision, yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google, as a private corporation, is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, it’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are toward deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money from search — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know, because it will do worse than an alternative search engine, in a world in which we have search engines competing with each other.

    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. He presents it as the empowerment of an individual, but an individual is only one side of a conversation — which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case — the affirmative case — has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So when you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s, "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure transparency comes not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, including possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page, marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. And we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what was it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan: Well, I’m sorry — the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared U.S. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent Professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book, most recently, "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side — that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions — you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and children when they become 18 — to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America: a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information: not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or, for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this; they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed. She has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends — you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist dictators or communist dictators, don’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique, the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it (and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too), the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — That’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) Whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This law seems to be tailored very specifically to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press, to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question — I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer for his private purposes design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, it’s blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money from search: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation. Because that’s what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you: that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you. This is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening keeps shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure transparency applies not to any specific case, only to the overall criteria, and that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online: on the first vote, 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google, while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish premier league referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and — in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just, like, a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific — it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that as for a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, it’s blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on — it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates. And I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in their first vote to 56 percent in their second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion — of course, within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society — because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of individuals to decide for themselves what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state — because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish — but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the courts. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California there is now a law which gives the possibility for parents — and for children when they become 18 — to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion — the emotional rationale that lies behind the "right to be forgotten" — are powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed — because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T — applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important — but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for "no longer relevant" information — in the words of the decision by the European Court of Justice, and in the view of the complainant — to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog — like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter]

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post — namely that a concert pianist was asking for a takedown of the critique, the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the U.S. is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results: here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people, or, you know, stops the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked on implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared U.S. debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now, it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the U.S.: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the U.S. to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states that allow people’s criminal records to be erased — mean that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.
    [laughter]
    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think there are administrative criteria we can set. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion. They’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found, and would find, that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is, you know, an alarming signal against imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying it’s another avenue of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility — as in America against a decision of the FTC — to go to court.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: all of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. What’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared U.S. debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion "The U.S. Should Adopt a ‘Right to Be Forgotten’" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion, "The U.S. Should Adopt a ‘Right to Be Forgotten’ Online." And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified by overcoming a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you. This is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard of — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure there is transparency not on specific cases but only on the overall criteria, and that seems very, very dangerous to me. Now, John said we didn’t have solutions, and in 30 seconds it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No [spelled phonetically]. Thank you very much — just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much — I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": in the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent on the first vote to 56 percent on the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. From me, John Donvan, and Intelligence Squared U.S.: thank you, and we’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you’re saying — something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, or the bank, or, you know, a school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. And in the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the federal trade commissioner, Julie Brill, just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, or he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have I assume some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment when it is granted Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say in secret so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state with some trepidation to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down, it’s like saying the book can stay in the library. We just have to set fire to the card catalog, like go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. What chilling effect is bigger, if we are able to know everything about you and even in a way which you wouldn’t even think of or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not if that’s what you’re wanting to do Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten when you have a private company making the decision at the insistence of the government it must keep it secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion and, for example, you know, newspapers they have to make on a daily basis a decision do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of like EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote, what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here, where a Scottish premier league referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question like directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in this story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way in your view to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over. I’d really like you guys to give it its fullest consideration, there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper in your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power would be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it’s based on an inherent right to privacy, but how do you square that with at least the U.S., going to your constitutional point, that says, you know, with the Fourth Amendment search and seizure context, that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook, how do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if in the case of the Telegraph they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on, you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer for his private purposes design his own web engine, so this constellation would mean that you are not a controller, you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal — but as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people, so perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal to imposing something like this? John Donvan: In a sense are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete, is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high on your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law as anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the EU commission, directorate general for justice and consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in the times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now, this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell, I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records like legal documents and court materials, at options for contextualization to include possibly even somebody like Google deciding, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such and then many won’t go to the second page, reputation systems and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this was a little bit of a tough debate in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about, you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on The U.S. Should Adopt the "Right to be Forgotten" Online: 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion came in at 35 percent; that means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent Professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a co-founder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion — of course, within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he has done or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice, or if you are a politician. There are limits to this, limits which are important in a democratic society — because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them — because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals, and you profile them. When you put in the name of an individual, Google gives you more than anything else — any other source of information — about this person, and our law requires that, in the same way that our people can ask data brokers or the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact, in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state — because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish — but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision, you know, was in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you, with between 200 and 2,000 parameters about every person.

  • 00:12:02

And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and for children when they become 18 — to wipe out everything until they were 18. In the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America: a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion — the emotional rationale that lies behind the "right to be forgotten" — are powerful. And they are understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten".

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas — all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information — not libel, not defamation, not hate speech.

  • 00:15:59

True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought — which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or, for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood — they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is: are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were: have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were: should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera — that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information — in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant — to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, when we say deletion of information published by others, that is deletion literally of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent one. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index, rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Go on down, enjoy the stacks — to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough — that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is raise the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person — become a politician — journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here" — like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know — an interest in someone who stepped out, him or herself, into the public — the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post — namely that a concert pianist was asking for a takedown of the critique, the bad critique, of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I’ll give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — Paul is doing good work in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are: no longer relevant, inadequate information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law — that’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: Well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about — they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story — that won’t be linked under the other guy’s name — push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. [laughter] Eric Posner: You know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take one right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion; they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question — I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU, it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that and, if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s it, you know — the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. Google makes money by searching well — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. He speaks of the empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell, I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening is shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure there’s transparency not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of the people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. From me, John Donvan, and Intelligence Squared U.S., we’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve most recently written the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "right to be forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said that you actually thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s — "right to be forgotten," and you said that there is a certain elegance to the idea, because, you’re saying, something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten," the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask a school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," are powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: the right to be forgotten. We call it a right, but really, it’s a duty, and Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. This is about suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said, the way he framed it was, that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera, that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, "Bing, how many of these are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog: go on down, enjoy the stacks — to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power, or the individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make — if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff — before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook — you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law — that’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make, on a daily basis, a decision: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made, himself or herself, the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examp les, the guy who did the professional [unintelligible] the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, I need to just in the interest of time I need to interrupt you for that, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was justmaking. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy and Paul is actually saying it’s the ordinary guy who’s getting the take downs and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the U.S. is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results: here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked on implementation. Are there cases where individuals who are otherwise mentioned in a story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedent in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passes a statute that says something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power is delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And — I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states that allow people’s criminal records to be erased — mean that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.

    [laughter]

    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten."

    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search-and-seizure context, which says you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: Unless you really need to respond to that question, I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me, in Europe, applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir, in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying another way of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir, in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the authority.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic role for the law, to weigh the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money: the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is "The US should adopt the 'right to be forgotten' online."

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion "The US should adopt a 'right to be forgotten'" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that’s our motion, "The US should adopt a 'right to be forgotten' online." And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. It is the empowerment of an individual on one side of a conversation — which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what was it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan: Well, I’m sorry — the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in their first vote to 56 percent in their second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]
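The scoring rule read out above, where the winner is the side whose vote share moves the most between the two polls, comes down to a few lines of arithmetic. This is an illustrative sketch only; the helper name is invented, and the figures are the ones announced on stage:

```python
# Sketch of the debate's scoring rule: the side whose share of the
# audience vote moves the most between the two polls is the winner.
# "winner_by_delta" is a made-up helper name for illustration.

def winner_by_delta(first_vote, second_vote):
    """Each argument maps a side to its percentage of the audience vote."""
    deltas = {side: second_vote[side] - first_vote[side] for side in first_vote}
    winner = max(deltas, key=deltas.get)
    return winner, deltas

# The percentages announced on stage (undecided voters omitted):
first_vote = {"for": 36, "against": 26}
second_vote = {"for": 35, "against": 56}

winner, deltas = winner_by_delta(first_vote, second_vote)
print(winner, deltas)  # "against" gains 30 points; "for" loses 1
```

Note that the side arguing "against" wins on a 30-point gain even though more raw supporters is not the criterion; only the movement between the two votes counts.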

  • 01:31:40


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book, most recently, "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice, or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, in the beginning, very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional rationale that lies behind the "right to be forgotten" is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut (W-U-U-T), which are applications that are designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or, for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when a request is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made the decision himself or herself to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power will be delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —
    [laughter]
    Eric Posner: — you know, for example, I would completely rule out public figures from using the "right to be forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set these up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, which says that as to a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook, and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And, by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency, like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, against the decision of the FTC, to go to court.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, to weigh the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money from search; the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz. [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation. Because that is what speech on the internet is — it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin. [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So when you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you? Is the "he" me? Or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner. [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria, and that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause]
    John Donvan: Well, I’m sorry, the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. From me, John Donvan, and Intelligence Squared U.S., thank you, and we’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared U.S. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book, most recently, "The Twilight of International Human Rights Law." And back in the U.S., when Europe adopted its "right to be forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said, sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, you can ask for information about your credit which is older than seven years to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches for his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, and she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days the divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the "right to be forgotten" would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift — they’re like looking at the box and it’s ticking — I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can — back to you guys — I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information — in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant — to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy — but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog — like, go to town, enjoy the stacks — to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person — become a politician — journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know — an interest in someone who stepped out, him or herself, into the public — the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, for example, from the Washington Post — namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make — if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes — and there’s frequently a time component, you know, five, 10 years ago — and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision at the insistence of the government — it must keep it secret, and the cases we hear about are happenstance — how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? A) Whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech — a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about — they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches [spelled phonetically] instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google specific — it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about — how strict the criteria would be — and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU, it sounds like it would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations — but of course, within reasonable national security limitations and so on. You know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine, in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation — and that is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you. This is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s, "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to be Forgotten' Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion drew 35 percent, which means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second; they went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s — "Right to Be Forgotten," and you said that there is a certain elegance to the idea because — you’re saying — something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion — of course, within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or, for that matter, if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society — because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else — any other source of information — about this person, and our law requires that, in the same way that people can ask data brokers, the bank, or the school for things to be deleted after a certain time, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state — because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, in the beginning, very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you, with between 200 and 2,000 parameters about every person.

  • 00:12:02

And these files, they are sold — for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California there is now a law which gives the possibility for parents — and for children, when they become 18 — to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion — the emotional rationale that lies behind the "right to be forgotten" — is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here; we’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut — W-U-U-T — which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so, finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, misses some credit card bills, and files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — applies for a job; the employer puts his name in Google; the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important — but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera — that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can — back to you guys — I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog — go to town, enjoy the stacks — to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here" — like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know — an interest in someone who stepped out, him or herself, into the public — the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, for example, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power, or an individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make — if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — the good work that Paul is doing in the EU to look at a Facebook and say, gees, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out — because why not, if that’s what you’re wanting to do, Facebook — you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information" — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examp les, the guy who did the professional [unintelligible] the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, I need to just in the interest of time I need to interrupt you for that, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was justmaking. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy and Paul is actually saying it’s the ordinary guy who’s getting the take downs and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — you would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story — that won’t be linked under the other guy’s name — push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curveball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific just because the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter]
    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question — I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are — [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]
    Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no — we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google — that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz. [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin. [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner. [applause]
    John Donvan: The motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s, "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No [spelled phonetically]. Thank you very much — just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause]
    John Donvan: Well, I’m sorry — the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and in the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the team arguing for the motion came in at 35 percent, which means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision, you know, in the beginning was very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign, those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat, Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera, that is to say in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, "Bing, how many of these are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that, whether fascist dictators or communist dictators, they don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "right to be forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "right to be forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: the Congress passed a statute that said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together — meaning that he’s on our side. [laughter] Eric Posner: You know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set these up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front, in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion, and the right to ask what do you have about me, applies in Europe to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is, perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here — and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, it’s blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. Google makes money from searches — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion: the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote: the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web — let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff comes down." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t owe solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master — the people voted before hearing that — but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you if you would be willing to make a donation to go to our website, that’s IQ2US.org and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center, we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online and the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote the result was for the team arguing for the motion their second vote was 35 percent, that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percent. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you. So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station, all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten" is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember is one of the most fundamental rights I think of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing, vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat, Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern, Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy, he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on but they still have a relatively complete image of who she is. A third example, a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, or he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out comes these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is are there problems with privacy? Yes, there are. And if that were the motion I would hope you would join us in voting for it. If the question were have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, "Bing, how many of these are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, that is. But they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, for example, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where some entity or individual actually has that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law — that’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — about a Scottish Premier League referee named "Dougie" McDonald. There was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results: here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact — I don’t know — calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

John Donvan: Well —

Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

Jonathan Zittrain: Thank you.

John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

Male Speaker: Hi.

John Donvan: If you can stand up and grab the mic.

Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

John Donvan: Paul Nemitz, in Europe?

Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press, to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that?

Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

Paul Nemitz: All search engines.

Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

John Donvan: Sir. Can you tell us your name, please?

Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine if this happened in the US: the Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

John Donvan: Good question.

Male Speaker: And then if, you know, Google wanted to deny a request or didn’t want to deal with the process, you could have some special advocate appointed to —

John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?

Jonathan Zittrain: It is curious that you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

John Donvan: Eric, did you want to respond to that?

Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.

[laughter]

Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think that there are administrative criteria. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

John Donvan: [unintelligible] take right down in front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "right to be forgotten."

Female Speaker: Right. So, to me it’s based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner.

Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

Jonathan Zittrain: I —

John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.

John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]

John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

[talking simultaneously]

John Donvan: Yeah, thanks.

Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —

[laughter]

John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me?" in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.

John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]

Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks.

Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —

Male Speaker: Not exactly. I’m saying another way of press suppression —

John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]

Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin.

Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still: censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we’re going to delete or not: what are the checks on that from outside? How does that work?

Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.

Male Speaker: But Paul —

Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google — that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

Male Speaker: Yeah, Paul, I have to — [unintelligible].

John Donvan: [unintelligible].

Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that.

Eric Posner: There are interests on both sides.

John Donvan: Eric Posner.

Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —

Andrew McLaughlin: How are they vindicated?

Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. The better Google’s searches are, the more money it makes. This is why —

[talking simultaneously]

Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

Eric Posner: No, but they don’t want to set a precedent where they —

Jonathan Zittrain: Google will pay no penalty for over-deleting.

Eric Posner: They will.

Jonathan Zittrain: And no one will even know.

Eric Posner: They will.

Jonathan Zittrain: Who will even know?

Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause]

John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers.

Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.

John Donvan: Thank you, Paul Nemitz. [applause]

John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

John Donvan: Thank you, Andrew McLaughlin. [applause]

John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children, that you wouldn’t want them to hear, is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you? Is the "he" me? Or is the "he" Google? Thank you very much.

John Donvan: Thank you, Eric Posner. [applause]

John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure there is transparency not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t owe solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization — to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; reputation systems; and, of course, competition —

John Donvan: Jonathan Zittrain, I’m sorry —

Jonathan Zittrain: — can I end with one thing —

John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]

John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

[talking simultaneously]

Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause]

John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said that you actually thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten": the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank, or the school for things to be deleted after a certain time, they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated. And there was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that they now deal with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the courts. Google has understood the European law of balance between privacy and free speech. The decision was, you know, very contested in the beginning; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives parents the possibility, and children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, you can ask for credit information which is older than seven years to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty: the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said, the way he framed it, that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. Those of you who are old enough will remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests. It’s like, time to make the donuts. You’re like, yes, no, no. And Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, “here’s what was deleted today,” it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point, the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera, that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out; it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to eliminate the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, “no longer relevant in the view of the complainant,” information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us, along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is, the US should adopt the “right to be forgotten” online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the “right to be forgotten” online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say, put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the “right to be forgotten,” is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is “censorship,” that the kind of right that we’re talking about, where an individual can go to Google and say, “When my name comes up, I don’t want this link to show up or this link to show up or this link to show up,” amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that, whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU “right to be forgotten” exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering it, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say “deletion of information published by others,” that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, and tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term “tort action”? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word “censorship.” I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a “right to be forgotten” implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word “censorship” is not helpful here. What the “Right to Be Forgotten” does is raise the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, “Look, there’s just nothing much to see here,” like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the “Right to Be Forgotten” doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, “I’m a Democrat,” “I’m a Republican.” Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is, what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes. So that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power, or for an individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, “You can have this, or you can have every single thing you do in the world be Google-able forever,” I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web, and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable: that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are “no longer relevant,” “inadequate information,” and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. These old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is that through repeated decision-making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people — and not ordinary people who commit crimes or serious crimes — and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make, on a daily basis, a decision: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made, himself or herself, the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google “right to be forgotten horror stories.” That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European “right to be forgotten.” Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results: here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay. I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the “right to be forgotten” infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is, you know, controversy about whether it’s okay — for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked on implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curveball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just, like, a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —
    [laughter]
    Eric Posner: — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion — they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So, to me, it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller. You would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying it’s another form of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, when the two college students at Oklahoma University are 50 years old. When their kid searches them, should the first thing they find be this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody can create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power — the adjudicating power — to Google, a private corporation, to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth estate. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, to weigh the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. What’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money: the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. Because that is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified by overcoming a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden of showing that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you: perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure transparency comes not on specific cases, only on the overall criteria, and that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, thank you for devoting a lot of your opening statement to explaining so clearly what we’re talking about. You did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on — it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent — that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. Your most recent book is "The Twilight of International Human Rights Law." And back in the U.S., when Europe handed down its "Right to Be Forgotten" ruling a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s — "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank — you know, they can ask the school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around — but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and children when they become 18 — to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit: information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut — W-U-U-T — which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There were privacy laws — they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control — they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when a request is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, "Bing, how many of these are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist dictators or communist dictators, don’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or for the individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is that through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made the decision himself or herself to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here, where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked on implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe?
    Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific; the case just happened to be about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: are there mechanisms to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations, and the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious that you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement statutes in most states allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I are very close together, meaning that he’s on our side.
    [laughter]
    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten": only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria. Now, often it’s very difficult to set these up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works, and I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front, in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search-and-seizure context, where a reasonable expectation of privacy is something you don’t have the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity; you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough, which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations, but of course, within reasonable national security limitations and so on. You know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.
    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying it’s another avenue of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, when the two college students at Oklahoma University are 50 years old. When their kid searches them, should this racist chant they did be the first thing they find, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled is that people develop over time. You know, the thing that you do as a 16- or 17-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about it. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so my view on this is that, still, censorship, which I don’t mean to be kind of a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete, is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody can create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority, which is something like an independent agency such as the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s only blue sky and beyond that nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: all of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to disappear. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that referee’s name to be associated with it.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money by searching well; the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission’s Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. The construction Paul just put forward is, in my judgment, a false one. Speech on the internet is a conversation: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is not the empowerment of an individual on one side of that conversation; it is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement; we censor sexual abuse images of children; we censor a lot of different kinds of information. But each of those is justified because it overcomes a very high bar in order to stand as an exception to the right of free speech. The affirmative case has not met its burden of showing that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you: that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people expect. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come of this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So when you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening.
    [laughter]
    This has been wonderful. Sorry. We’ll strike that later. We’ve heard a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to make sure that reality does not come about. But notice how what the four folks were saying over the course of the evening keeps shifting. It’s a standard of, "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure disclosure happens not on any specific case, only on the overall criteria; that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: we all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online: in the first vote, 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me and, second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and children when they become 18 — to say: wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty, the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of you are getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, whether it’s more chilling for some entity to have that power or the individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay. I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And, Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and — in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I'm Adam. This — this law seems to be very specifically tailored to Google. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents, in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it's such a bad idea.

    Paul Nemitz: It's not Google-specific; it's just that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper maybe the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute that said something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes, in most states, that allow people's criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side. For the —

    [laughter]

    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten": only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in the front, in the third row, please. And the mic's coming down on your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion — they're arguing for the "right to be forgotten."

    Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that as for a reasonable expectation of privacy, you don't have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on.

    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it's exactly the kind of surveillance on who's wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like it'd give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.

    John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me?" in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you'd like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I'm saying another way of press suppression —

    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody's life. And so, you know, my view on this is, still, censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no — we're going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area — this is an area which is subject to the law. It's not a discretionary decision of Google — that, you know, the blue sky, and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.

    John Donvan: But it does — it does — it does live on. It does live on. It's not removed from the internet. It's just you can't find it through the criminal's name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that's the classic role for the law — so that the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. Google makes money — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don't want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his — and here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that's our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it's justified only when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you. Perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can't believe, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain: This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it's shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn't know solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won't go to the second page; reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I'm sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.

    [applause]

    John Donvan: Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month at Columbia University at the Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates. And I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already?

    Male Speaker: It was erased. [laughter]

    Male Speaker: Does anybody remember? [applause]

    John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second; they went up 30 percentage points. [applause]

    John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause]

    John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter]

    John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what?

    Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google.

    John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument.

    Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School.

    John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now?

    Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough.

    John Donvan: Especially after tonight.

    Eric Posner: Especially after tonight.

    John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause]

    John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause]

    John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language?

    Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty.

    John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew?

    Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain.

    John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause]

    John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because — you’re saying — something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter]

    Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes.

    John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause]

    Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion — of course, within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society — because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them — because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask a school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state — because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish — but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files are sold — for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California there is now a law which gives parents the possibility — and children, when they become 18 — to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part. And Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much.

    John Donvan: Thank you, Paul Nemitz. [applause]

    John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer in the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause]

    Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion — the emotional rationale that lies behind the "right to be forgotten" — are powerful. And they are understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information — not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.

    John Donvan: Thank you, Andrew McLaughlin. [applause]

    John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause]

    Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they were private. Not everybody in the world, or even outside this area, hears about these events. And the law protected them. There were privacy laws — they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends — you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause]

    John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] — School of Government. Ladies and gentlemen, Jonathan Zittrain.

    Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

[laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

I would see it exactly the other way around. History has taught Europeans, whether fascistic dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger, if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision at the insistence of the government, it must keep it secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers, they have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

Is that relevant in any way in your view to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over. I’d really like you guys to give it its fullest consideration, there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google specific that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper in your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that makes the case more compelling? So, for example, you know, we imagine if this happened in the US, the Congress passed a statute and say something like, something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power will be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, that says, you know, in the Fourth Amendment search and seizure context, that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook, how do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if in the case of the Telegraph they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation. I know the law in Europe doesn’t address it, but let’s say does it become relevant in this conversation. Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on, you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer for his private purposes design his own web engine, so this constellation would mean that you are not a controller, you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us, pointing to existing EU press law, that there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people, so perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal to imposing something like this? John Donvan: In a sense are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete, is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus where anybody could create a profile that shows up very high on your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law as anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So the better its searches are, the more money Google makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU commission, directorate general for justice and consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in the times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now, this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria; that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records like legal documents and court materials, at options for contextualization, to include possibly even somebody like Google deciding, and when it’s a search on a name maybe the first result shouldn’t be those 10 terrible whatever comes out of the roulette wheel links but a curated page marked as such, and then many won’t go to the second page, reputation systems and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this was a little bit of a tough debate in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about, you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already?
    Male Speaker: It was erased.
    [laughter]
    Male Speaker: Does anybody remember?
    [applause]
    John Donvan: All right. Well, for the sake of people who are listening —
    [laughter]
    Play along, please.
    [laughter]
    Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on The U.S. Should Adopt the "Right to be Forgotten" Online: 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion came in at 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points.
    [applause]
    John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time.
    [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz.
    [applause]
    John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC.
    [laughter]
    John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what?
    Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google.
    John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument.
    Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School.
    John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause]
    Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now?
    Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough.
    John Donvan: Especially after tonight.
    Eric Posner: Especially after tonight.
    John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."
    [applause]
    John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin.
    [applause]
    John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language?
    Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty.
    John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew?
    Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain.
    John Donvan: Ladies and gentlemen, Jonathan Zittrain.
    [applause]
    John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but —
    [laughter]
    Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes.
    John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz.
    [applause]
    Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this doesn’t just fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals, and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask a school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the courts. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. In the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin.
    [applause]
    Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles (the ones that I found were in the English-language press in the UK) outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause]
    Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause]
    John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy —
    [applause]
    John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain.
    Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is: are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were: have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were: should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour who is just getting a stream of those 100,000 requests. It’s like time to make the donuts. You’re like, yes, no, no. And Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The US Should Adopt the 'Right to Be Forgotten' Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The US Should Adopt the 'Right to Be Forgotten' Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their charge that this is censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here." Like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them. Because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, gees, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, which at the insistence of the government it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make, on a daily basis, a decision: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) Whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and I produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over. I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US, and Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, that says, you know, in the Fourth Amendment search and seizure context, that a reasonable expectation of privacy — you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money the better its searches are — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the "for" folks were saying over the course of the evening is shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it’s not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this was a little bit of a tough debate in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again the motion was: The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and in the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result was, for the team arguing for the motion, their second vote was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the federal trade commissioner, Julie Brill, just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat, Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no. And Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera, that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant" information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index, rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique, the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. What chilling effect is bigger, if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty, like the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session; decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers, they have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results, and now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, saves the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The US should adopt the 'right to be forgotten' online." Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I'm Adam. This law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.
    Paul Nemitz: It's not Google-specific; it's just that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine if this happened in the US: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to deny a request or didn't want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious that you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side.
    [laughter]
    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria. Now, often it's very difficult to set them up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion, they're arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, where you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on.
    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found and would find that unconstitutional, because it's exactly the kind of surveillance on who's wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.
    John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me?" in Europe apply to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you'd like to address that.
    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?
    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I'm saying another way of press suppression —
    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should the first thing they find be this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody's life. And so, you know, my view on this is, still, censorship — which I don't mean to be a kind of nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America against a decision of the FTC, to go to court.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google where, you know, there's the blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.
    John Donvan: But it does live on. It does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that's the classic role for the law, so that the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated, and what's on the side of disclosure is something far more powerful than the government. It's the profit motive. Google makes money from its searches: the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don't want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is "The US should adopt the 'right to be forgotten' online."

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that's our motion, "The US should adopt a 'right to be forgotten' online." And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But in each case the censorship is justified by overcoming a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you — that perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come of this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can't believe, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    Jonathan Zittrain: This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening keeps shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, when Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know? Paul has been working, I think, to make sure transparency comes not on specific cases, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn't offer solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won't go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I'm sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.
    [applause]
    John Donvan: Well, I'm sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on The U.S. Should Adopt the "Right to be Forgotten" Online: 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "right to be forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten": the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and there was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house to cover the social contributions more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California there is now a law which gives parents the possibility, and children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, you can ask for credit information which is older than seven years to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have articles about that suppressed in searches on his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, "Bing, how many of these are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say deletion of information published by others, that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state with some trepidation to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — Paul is doing good work in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision at the insistence of the government it must keep it secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made the decision himself or herself to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish premier league referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that is now not the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: Let’s take one right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search-and-seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would — you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying it’s another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, that censorship — and I don’t mean that to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money from search — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the pro folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it’s not on the specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t owe solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in terms of it being relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts. You’re like, yes, no, no. And Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to eliminate the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, “no longer relevant” in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is “The U.S. Should Adopt the ‘Right to Be Forgotten’ Online.” We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the “right to be forgotten,” is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is “censorship”: that the kind of right that we’re talking about, where an individual can go to Google and say, “When my name comes up, I don’t want this link to show up or this link to show up or this link to show up,” amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU “right to be forgotten” exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their charge that this is censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as it did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear — when we say “deletion of information published by others,” that is deletion, literally, of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, and tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term “tort action”? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent one. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word “censorship.” I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving is that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a “right to be forgotten” implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct-response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough; it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word “censorship” is not helpful here. What the “Right to Be Forgotten” does is raise the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, “Look, there’s just nothing much to see here,” like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is pretending that everybody who makes use of this right is someone about whom there is some public interest in knowing what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the “Right to Be Forgotten” doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or if we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, “I’m a Democrat,” “I’m a Republican.” Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, “You can have this, or you can have every single thing you do in the world be Google-able forever,” I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable: that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are “no longer relevant” and “inadequate” information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is that through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not-famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google “right to be forgotten horror stories.” That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European “right to be forgotten.” Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. Which is that I very much agree with Paul that what is in front of us here is a sort of false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results: here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the “right to be forgotten” infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over. I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes.

John Donvan: Well —

Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park."

Jonathan Zittrain: Thank you.

John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

Male Speaker: Hi.

John Donvan: If you can stand up and grab the mic.

Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?

John Donvan: Paul Nemitz, in Europe.

Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that?

Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

Paul Nemitz: All search engines.

Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

Paul Nemitz: It’s not Google specific — it’s that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

John Donvan: Sir. Can you tell us your name, please?

Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US, the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power will be delegated would promulgate some regulations. And the regulations would be a bit more specific.

John Donvan: Good question.

Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

John Donvan: Eric, did you want to respond to that?

Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.

[laughter]

Eric Posner: For example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented.

John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten."

Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, which says that as for a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner.

Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.

Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

Jonathan Zittrain: I —

John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.

John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

[laughter]

John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

[talking simultaneously]

John Donvan: Yeah, thanks.

Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are —

[laughter]

John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.

John Donvan: So, how far are you getting in asking the NSA to delete information?

[laughter]

Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller. You would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks.

Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting —

Male Speaker: Not exactly. I’m saying another way of press suppression —

John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

[applause]

Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin.

Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.

Male Speaker: But Paul —

Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

Male Speaker: Yeah, Paul, I have to — [unintelligible].

John Donvan: [unintelligible].

Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

John Donvan: But it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that referee’s name to be associated with that.

  • 01:17:01

They have an interest in that.

Eric Posner: There are interests on both sides.

John Donvan: Eric Posner.

Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —

Andrew McLaughlin: How are they vindicated?

Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So, Google makes money the better it searches; the better its searches are, the more money it makes. This is why —

[talking simultaneously]

Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

Eric Posner: No, but they don’t want to set a precedent where they —

Jonathan Zittrain: Google will pay no penalty for over-deleting.

Eric Posner: They will.

Jonathan Zittrain: And no one will even know.

Eric Posner: They will.

Jonathan Zittrain: Who will even know?

Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause]

John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate General for Justice and Consumers.

Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much.

John Donvan: Thank you, Paul Nemitz.

[applause]

John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

John Donvan: Thank you, Andrew McLaughlin.

[applause]

John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

John Donvan: Thank you, Eric Posner.

[applause]

John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society.

Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

[laughter]

This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the folks on the other side were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff comes down." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure transparency applies not to any specific case, only to the overall criteria. That seems very, very dangerous to me. Now, John said we offered solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —

John Donvan: Jonathan Zittrain, I’m sorry —

Jonathan Zittrain: — can I end with one thing —

John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

[laughter]

[applause]

And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause]

John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

[talking simultaneously]

Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

[applause]

John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and the ticket sales don't come close to covering the cost of mounting a great debate like this. So, I'd encourage you, if you would be willing to make a donation, to go to our website, that's IQ2US.org, and make a donation there. Our next debate is later this month. It's at Columbia University at the Miller Theatre. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We're going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April, we're going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We've heard the arguments — sorry? Did I — sorry? There's chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We'll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I've done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let's have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I'm John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let's meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that's what you're going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You've written the book, most recently, "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven't, but I'm hoping they'll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let's welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You're no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I'd probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There's really no place in the world that's been so affected by it. But I do think it's a travesty. John Donvan: Okay, we're making clear you're not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you're a professor of law and computer science at Harvard. You're a cofounder of the Berkman Center for Internet & Society. You looked at the EU's — Europe's — "Right to Be Forgotten," and you said that there is a certain elegance to the idea because, you're saying, something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I've described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We're very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you'll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let's register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you're for the motion, number two if you're against, and number three if you're undecided. And we'll take about 15 seconds to complete that process, and then we'll lock out the voting devices. All right, folks, let's lock out those votes. And, again, we'll have the second vote right after you've heard the closing arguments. And then we'll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I'm not here on an official mission from the European Commission to turn America to this notion of law. It's more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don't tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already in the '70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can't ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that's what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don't have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn't fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn't apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time, for things to be deleted, they should be able to ask Google for deletion. Now, the person who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state, because if it's in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn't mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what's happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. And under the Fair Credit Reporting Act, credit information which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I'm going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn't even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that's the outline of the argument that I'm going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it's a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let's be very clear, that is what we're talking about here. We're talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory, because it suppresses the documents that constitute that memory. So, we're talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it's way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court's decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It's very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul's claim that it's working kind of well, you can Google any number of articles (the ones that I found were in the English-language press in the UK) outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you'll find the case of the piano player who didn't like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it's conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that's not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we've seen that the internet adapts. We've seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I'll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what's going on: we are halfway through this opening round of the Intelligence Squared US debate. I'm John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You've heard two of the opening arguments, and now on to the third. Let's welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I'd like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or for some of the younger people here, I'll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He's arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That's not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not know about it. They know a lot about this kid. He moves on with his life. It's not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, and she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she's able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she's like and how she is a good mother. So, they know a little bit about what's going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other's infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they're private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let's say it could be an employer or a creditor or a future romantic partner — it's really in the stranger's interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn't convicted, it's still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people's privacy." But, of course, this is what's happened. It's happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the '80s and '90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That's completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people's privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let's suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother's friends, you know, in the '90s would have gossiped, where today they would write about it on Facebook. Maybe she would've written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That's how she's going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It's public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents' names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they've lost control over this information about themselves, and it's not just that they've lost this control; they've lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We'd vote for that motion, too. I think I'm comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts — you’re like, yes, no, no — and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift — they’re like looking at the box and it’s ticking — I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera — that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, "Bing, how many are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone — it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered — and maybe they can, back to you guys — I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The US Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The US Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy — but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion — literally — of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving is that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down — I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks — to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough — that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is raise the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person — become a politician — journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here" — like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is pretending that everybody who makes use of this right is someone about whom there is some public interest in knowing what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know — an interest in someone who stepped out, him or herself, into the public — the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post — namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes — that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power, or for the individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make — if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff — before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook — you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is that through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people — non-famous people — are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people — and not ordinary people who commit crimes, or serious crimes — and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, and the cases we hear about are happenstance — how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make, on a daily basis, a decision: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction — dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech — a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about — they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan:

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I'm Adam. This — this law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press, to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.
    Paul Nemitz: It's not Google-specific — the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: Congress passed a statute and said something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional for one thing, but I also think it’s just fundamentally a terrible, terrib le idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased that, you kno w, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion; they're arguing for the "right to be forgotten."
    Female Speaker: Right. So, to me, it's based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, where, as to a reasonable expectation of privacy, you don't have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I'm going to — unless you really need to respond to that question — I'm going to move on.
    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature –" I’m not making this up — "you have to regi ster at your local post office. Otherwise , for your own convenience the post office will trash any communist mail otherwise destined for you. The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and, if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.
    John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me — particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you'd like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here — and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I'm saying another way of press suppression —
    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet ha s leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete is not the right answer. We need to have the internet evolve. I’ll like to see Google evolve, to add a more direc t right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area — this is an area which is subject to the law. It's not a discretionary decision of Google — that, you know, the blue sky, and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says: be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.
    John Donvan: But it does — it does live on. It does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that's the classic role for the law, so that the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. So Google makes money by searching well — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don't want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round: Paul Nemitz. He is a director at the European Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands — also in times when technology makes total prediction, total collection of anything you do, possible — then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of th e social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the de bate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of differentkinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This ca se, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image th at I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine, again, but this time that it's you — that perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So when you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can't believe that, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it's shifting. It's a standard that's, "Well, we can tweak that. We'll make sure that only the good stuff stays and only the bad stuff goes." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like: When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know? Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn't owe solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won't go to the second page; reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I'm sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this w as a little bit of a tough debate in terms of it was relatively nuanced and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about. You just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.
    [applause]
    John Donvan: Well, I'm sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and on the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the Federal Trade Commissioner, Julie Brill, just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on but they still have a relatively complete image of who she is. A third example, a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were should everything be recorded at all times, and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the "for" side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant," in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest in knowing what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who has stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, for example, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique, the bad critique of his piano performance here in America: yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make a decision on a daily basis: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the U.S. is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story — that won’t be linked under the other guy’s name — push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy, and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable? Jonathan Zittrain: It is curious that you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. [laughter] Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set these up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented. John Donvan: [unintelligible] take right down in the front, in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past, there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law; but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And, by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, where, you know, there’s the blue sky and beyond that nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money from searching — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard of "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry, the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment: first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning were very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. And under the Fair Credit Reporting Act, information about your credit which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google any number of articles; the ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal like that — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, "here’s what was deleted today," it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera, that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say, the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as it did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say deletion of information published by others, that is deletion of, literally, information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, for example, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, which at the insistence of the government it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examp les, the guy who did the professional [unintelligible] the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, I need to just in the interest of time I need to interrupt you for that, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was justmaking. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy and Paul is actually saying it’s the ordinary guy who’s getting the take downs and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish premier league referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called “hitting a curve ball out of the park.”
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the “right to be forgotten” online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the “right to be forgotten.”
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very, “Nothing to see here,” kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific; it’s that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the “no” side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the “right to be forgotten” in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US, and Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a “right to be forgotten” workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter]
    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the “Right to Be Forgotten,” only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the “Right to Be Forgotten” were implemented.
    John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion — they’re arguing for the “Right to Be Forgotten.”
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, that says, you know, in the Fourth Amendment search and seizure context, that a reasonable expectation of privacy — you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, “You are being traced,” just to access the suppressed articles, because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, “For anybody interested in receiving communist literature” — I’m not making this up — “you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you.” The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the “for” side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough, which it sounds like you probably are — [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on, you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the “yes” side: Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, so that the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money: the better its searches are, the more money it makes. This is why — [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the “right to be forgotten” online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a “right to be forgotten” — not necessarily the one of the EU, but a “right to be forgotten.” Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the “right to be forgotten” online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz. [applause]
    John Donvan: And that’s our motion, the US should adopt a “right to be forgotten” online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified only when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin. [applause]
    John Donvan: And the motion is “The U.S. Should Adopt the Right to Be Forgotten Online.” And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: “He who controls the past controls the future.” Orwell’s invocation was inevitable. But I want to ask you, though, who is the “he” in that statement? Is the “he” you, is the “he” me, or is the “he” Google? Thank you very much.
    John Donvan: Thank you, Eric Posner. [applause]
    John Donvan: The motion is “The U.S. Should Adopt the Right to Be Forgotten Online.” And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — “Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out.” But then I come to the question Eric just asked: who is “we”? And in the proposal, “we” is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, “When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?” Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; options for contextualization, to include possibly even somebody like Google deciding that, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, “The U.S. Should Adopt the Right to Be Forgotten Online.” If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and the ticket sales don't come close to covering the cost of mounting a great debate like this. So, I'd encourage you, if you would be willing to make a donation, to go to our website, that's IQ2US.org, and make a donation there. Our next debate is later this month. It's at Columbia University at the Miller Theater. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We're going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we're going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We've heard the arguments — sorry? Did I — sorry? There's chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We'll see you next time. [applause]

  • 01:31:40

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I've done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let's have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I'm John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

Let's meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that's what you're going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You've written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe adopted its "Right to Be Forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven't, but I'm hoping they'll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let's welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You're no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I'd probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There's really no place in the world that's been so affected by it. But I do think it's a travesty. John Donvan: Okay, we're making clear you're not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you're a professor of law and computer science at Harvard. You're a cofounder of the Berkman Center for Internet & Society. You looked at the EU's — Europe's "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you're saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I've described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We're very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you'll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let's register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

Push number one if you're for the motion, number two if you're against, and number three if you're undecided. And we'll take about 15 seconds to complete that process, and then we'll lock out the voting devices. All right, folks, let's lock out those votes. And, again, we'll have the second vote right after you've heard the closing arguments. And then we'll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I'm not here on an official mission from the European Commission to turn America to this notion of law. It's more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words, they don't tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the '70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he has done or she has done, in terms of liability. Of course you can't ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that's what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don't have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn't fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn't apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, more than any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact, in order to make the auctioning of the house to cover the social contributions more attractive for the Spanish state, because if it's in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn't mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which go to the courts. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what's happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and the federal trade commissioner, Julie Brill, just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," are powerful. And they are understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I'm going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn't even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that's the outline of the argument that I'm going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it's a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let's be very clear, that is what we're talking about here. We're talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we're talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it's way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

The language in the European Court's decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It's very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul's claim that it's working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you'll find the case of the piano player who didn't like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with a steady rule of law, I suppose it's conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that's not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we've seen that the internet adapts. We've seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so, finally, I'll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what's going on: we are halfway through this opening round of the Intelligence Squared US debate. I'm John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You've heard two of the opening arguments, and now on to the third. Let's welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I'd like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I'll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he's arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That's not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It's not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she's able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she's like and how she is a good mother. So, they know a little bit about what's going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other's infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they're private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the '80s and '90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That's completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people's privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let's suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother's friends, you know, in the '90s would have gossiped, where today they would write about it on Facebook. Maybe she would've written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That's how she's going to be defined to the public. And in the case of the divorced couple, in the old days the divorce would be public. It's public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents' names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they've lost control over this information about themselves, and it's not just that they've lost this control; they've lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We'd vote for that motion, too. I think I'm comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift — they’re like looking at the box and it’s ticking — I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera — that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, "Bing, how many of these are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can — back to you guys — I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, with procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anybody to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Go down, enjoy the stacks — to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here" — like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave from the Washington Post — namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out — because why not, if that’s what you’re wanting to do, Facebook — you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared U.S. debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody whom I could then demand that my data be removed from, or not?

    John Donvan: Paul Nemitz, in Europe?

    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz: It’s not Google-specific; it’s that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the no side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the U.S.: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the U.S. to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.

    [laughter]

    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "right to be forgotten."

    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I’m going to — unless you really need to respond to that question — I’m going to move on.

    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found, and would find, that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.

    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I’m saying another way of press suppression —

    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody can create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, it’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: all of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that’s the classic rule for the law — so that the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don’t want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes round two of this Intelligence Squared U.S. debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the U.S. should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the U.S. should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that’s our motion, "The U.S. Should Adopt a ‘Right to Be Forgotten’ Online." And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s, "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    John Donvan: And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan: Well, I’m sorry — the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and in the first vote, 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result was, for the team arguing for the motion, their second vote was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decision, you know, was in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera, that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) Whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results and — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I'm Adam. This — this law seems to be very Google-specifically tailored. And I'm wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea. Paul Nemitz: It's not Google specific that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individua l, when you put in a name, bringing up all the information existing about this individual, goes far deeper in your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, ple ase?Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically] . So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that makes the case more compelling? So, for example, you know, we imagine if this happe ned in the US, the Congress passed a statute and say something like, something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power will be delegated would promulgate some regulations. And the regulations would be a b it more specific. John Donvan: Good q uestion. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional for one thing, but I also think it’s just fundamentally a terrible, terrib le idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased that, you kno w, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you're talking about, the side arguing for the motion — they're arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, which says that you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook, how do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That's in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really nee d to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the libra ry to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if in the case of the Telegraph they had a list of suppressed articles but to acces s those suppressed articles you had to log in, in some fashion or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as delete — John Donvan: Okay, I think we see where y ou’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain:I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature –" I’m not making this up — "you have to regi ster at your local post office. Otherwise , for your own convenience the post office will trash any communist mail otherwise destined for you. The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us. John Donvan: Okay. So you're asking, if you were smart enough, which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on, you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you'd like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I'm saying another way of press suppression — John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you're 50 now, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet ha s leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete is not the right answer. We need to have the internet evolve. I’ll like to see Google evolve, to add a more direc t right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision, yes or no, we're going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, against the decision of the FTC, to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It's not a discretionary decision of Google where, you know, there's the blue sky and beyond that nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I've got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It's not removed from the internet. It's just you can't find it through the criminal's name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee's name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that's the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what's on the side of disclosure is something far more powerful than the government. It's the profit motive. So Google makes money: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don't want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that's our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation, which is what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of th e social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the de bate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of differentkinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This ca se, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image th at I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it's you, that perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can't believe in the last slot I'm still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening. [laughter] This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening is shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google, not just being ordered to do something but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it's not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn't know solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records like legal documents and court materials, at options for contextualization, to include possibly even somebody like Google deciding that, when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won't go to the second page, reputation systems and, of course, competition — John Donvan: Jonathan Zittrain, I'm sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this w as a little bit of a tough debate in terms of it was relatively nuanced and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about. You just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so. [applause] John Donvan: Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April, we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this doesn’t just fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else — any other source of information — about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, the school, after a certain time, for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that they are now dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the courts. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives the possibility for parents — and children when they become 18 — to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part. And Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion — the emotional rationale that lies behind the "right to be forgotten" — is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas — all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is the ability to force other people to forget what they would otherwise remember. That ability — that right to remember — is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information: not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this; they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, and she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they were private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important — but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too — I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement why that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera — that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can — back to you guys — I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And, more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear — when we say "deletion of information published by others," that is deletion, literally, of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones that was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again — and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down — I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog — like, go on down, enjoy the stacks — to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter]

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is raise the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person — become a politician — journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here" — like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, for example, from the Washington Post — namely, that a concert pianist was asking for a takedown of the critique, the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or if we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power, or for the individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make — if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out — because why not, if that’s what you’re wanting to do, Facebook? You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web with stuff out on it, and you are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time, under incredibly vague criteria that are by their nature unreviewable: that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant" and "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people — and not ordinary people who commit crimes or serious crimes — and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make a decision on a daily basis: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is of public interest in a democracy, as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected — the politicians, the business people — to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — about a Scottish Premier League referee named “Dougie” McDonald. There was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about — they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people — or, you know, saves the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story — which won’t be linked under the other guy’s name — push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be tailored very specifically to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, we imagine if this happened in the US, the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power will be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So, to me it’s based on an inherent right to privacy, but how do you square that with, going to your constitutional point, at least in the U.S., the Fourth Amendment search-and-seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and, if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you: that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the pro folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on — it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and the ticket sales don't come close to covering the cost of mounting a great debate like this. So, I'd encourage you, if you would be willing to make a donation, to go to our website, that's IQ2US.org, and make a donation there. Our next debate is later this month. It's at Columbia University at the Miller Theater. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We're going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we're going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We've heard the arguments — sorry? Did I — sorry? There's chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second; they went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We'll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I've done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let's have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I'm John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let's meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that's what you're going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You've written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe adopted its "Right to Be Forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven't, but I'm hoping they'll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let's welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You're no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I'd probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There's really no place in the world that's been so affected by it. But I do think it's a travesty. John Donvan: Okay, we're making clear you're not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you're a professor of law and computer science at Harvard. You're a cofounder of the Berkman Center for Internet & Society. You looked at the EU's — Europe's "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you're saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I've described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We're very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you'll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let's register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you're for the motion, number two if you're against, and number three if you're undecided. And we'll take about 15 seconds to complete that process, and then we'll lock out the voting devices. All right, folks, let's lock out those votes. And, again, we'll have the second vote right after you've heard the closing arguments. And then we'll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I'm not here on an official mission from the European Commission to convert America to this notion of law. It's more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words, don't tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data — you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already in the '70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion — of course, within reason. You cannot ask a doctor to delete your medical records, because the doctor must keep them, either to show what he or she has done or in terms of liability. Of course you can't ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society — because that's what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don't have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this doesn't just fall from the sky; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn't apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals, and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank — you know, they can ask at school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question, who brought this case from Spain, had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to publish this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state — because if it's in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn't mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around. But Google is subject to the same law of self-determination over information, which allows individuals to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that they are now dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to court. Google has understood the European law of balance between privacy and free speech. The decision, you know, was very contested in the beginning; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what's happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California there is now a law which gives parents — and children, when they become 18 — the possibility to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, you can ask for credit information which is older than seven years to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion — the emotional rationale that lies behind the "right to be forgotten" — are powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I'm going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn't even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that's the outline of the argument that I'm going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it's a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to forget — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. And it works by suppressing true information — let's be very clear, that is what we're talking about here. We're talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we're talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it's way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court's decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It's very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul's claim that it's working kind of well, you can Google — well, while you still can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you'll find the case of the piano player who didn't like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it's conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that's not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we've seen that the internet adapts. We've seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I'll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what's going on: we are halfway through this opening round of the Intelligence Squared US debate. I'm John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You've heard two of the opening arguments, and now on to the third. Let's welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I'd like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you'll remember; for some of the younger people here, I'll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He's arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That's not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not. They know a lot about this kid. He moves on with his life. It's not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she's able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she's like and how she is a good mother. So, they know a little bit about what's going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other's infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they were private. Not everybody in the world, or even outside this area, hears about these events. And the law protected them. There were privacy laws — they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine, back then in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let's say it could be an employer or a creditor or a future romantic partner — it's really in the stranger's interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn't convicted, it's still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not-so-smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people's privacy." But, of course, this is what's happened. It's happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the '80s and '90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That's completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people's privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let's suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother's friends, you know, in the '90s would have gossiped, where today they would write about it on Facebook. Maybe she would've written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — and immediately finds out that she had a mental illness. That's how she's going to be defined to the public. And in the case of the divorced couple, in the old days the divorce would be public. It's public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents' names into Google, and out come these allegations. And not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they've lost control over this information about themselves. And it's not just that they've lost this control; they've lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We'd vote for that motion, too. I think I'm comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, "here’s what was deleted today," it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant, in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosey journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example the Von Hannover case: very clear criteria. What are they? A) Whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examp les, the guy who did the professional [unintelligible] the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, I need to just in the interest of time I need to interrupt you for that, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was justmaking. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy and Paul is actually saying it’s the ordinary guy who’s getting the take downs and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan:

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in this story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that this were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over; I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; the case was just about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: the Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states — allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think there are administrative criteria we can set. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion; they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, where you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe — than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by it is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And, by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — something like an independent agency such as the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But, Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, that’s the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. So Google makes money from search — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified only because it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever -evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photo graphs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting hi story, forgetting the past. The right to remember, is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and st atements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the folks on the other side were saying over the course of the evening is shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that’s not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don't know, we are a non-profit organization, and the ticket sales don't come close to covering the cost of mounting a great debate like this. So, I'd encourage you, if you would be willing to make a donation, to go to our website, that's IQ2US.org, and make a donation there. Our next debate is later this month. It's at Columbia University at the Miller Theater. We're partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We're going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April, we're going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We've heard the arguments — sorry? Did I — sorry? There's chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let's look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second; they went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We'll see you next time. [applause]


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the directo r of this — of this organization, this agency : The Fundamental Rights and Union Citizenship in the Directorate G eneral for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School . John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its " Right to Be Forgotten ” law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So , my question is have more of the critics come over to your side now? Eric Posner:No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: E specially after tonight . Eric Posner: E specially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were , would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I ’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making cle ar you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain.[applause] John Donvan: The eminent Jonath an Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Ber kman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor so lution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken do wn sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ," in this debate , and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this moti on for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So , let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screen s. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes wit hin about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these g reat American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words , the natural words, they don’t tell you the whole story , because actually it is about a deletion right . I t is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why this is important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self -determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about y ou. You must be able to ask “ What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done i n terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are i mportant in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy , how can you organize disse nt? How can you organize a new political party, which maybe wants to [unintelligible] , if already the government always knows everything about you. So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case,“T his doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them . When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person , and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, t he person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auc tioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So , the court said you cannot ask the newspaper to take down t he information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’ t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper of the BBC, of the television station, all this stays around, but Google is subject to the same law of self -control of information, of self-determi nation of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these reque sts in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood t he European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger censors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new censors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded and there areaddress dealers and information dealers which have files o n you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold . For example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children, then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the po ssibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit and Reporting Act about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and the federal trade commissioner, Julie Brill just yesterday said, “ Wes, this is what we need in America, a right to obscurity. ” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Dig and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so -called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies be hind the "right to be forgotten" is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten".

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forg et what they would otherwise remember. That ability, that right to remember is one of the most fundamental rights I think of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel , not defamation, not hate speech.

  • 00:15:59

    True information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought which is ultimately what memory is. So, this is the right to force people to forg et true information. Paul said that this is a — the way he framed it was that this is a right to control your ow n life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say . And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European courts’ decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer irrelevant or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well -conne cted elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed , because Google g ives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples : a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And the se are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing, vis -a -vis dictatorships. In a society with steady rule of law, I suppose it’s conceiva ble that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flush it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    S o , as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat , Wuut, W-U -U -T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden andembrace and allow for the self -reinvention that in so many ways define s what it is to be American ?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex -felons for the rest of their lives, not to penalize them by ma king it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future ." G iving any number of individuals a vague standard by which to contro l their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: A nd a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online ." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lecturn, Eric Posner. He is the Kirkland and Ellis distinguished service professor of law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17 -year -old boy, he’s arrested for selling drugs. A news item appears in the local pape r. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    S he sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a li ttle bit about what’s going on but they still have a relatively complete image of who she is. A third example, a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, so me of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law p rotects them. There are privacy laws — there were privacy laws, they still exist. There are even laws called, "expungement statutes," that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And a lthough free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about the mselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignoranc e, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious dec ision by the publicor by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay . That’s completely changed. Technology has changed. So the law at the time — th e privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected, but now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance . Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Goo gle, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their paren ts ’ names into Google and out comes these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So , they’ve lost control over this information about th emselves and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with thisbalance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online. " A nd here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law S chool and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is are there problems with privacy ? Y es , there are. And if that were the motion I would hope you would join us in voting for it. If the question were have problems of privacy gotten more difficult over time ? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were should everything be recorded at all times and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, “no longer relevant in the view of the complainant,” information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the “right to be forgotten” online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the “right to be forgotten” online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy is real, but they say the solution, the “right to be forgotten,” is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is “censorship,” that the kind of right that we’re talking about, where an individual can go to Google and say, “When my name comes up, I don’t want this link to show up or this link to show up or this link to show up,” amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that, whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU “right to be forgotten” exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say “deletion of information published by others,” that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term “tort action”? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word “censorship.” I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must never say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a “right to be forgotten” implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word “censorship” is not helpful here. What the “Right to Be Forgotten” does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, “Look, there’s just nothing much to see here,” like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the “Right to Be Forgotten” doesn’t apply. The example which Andrew gave, for example, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I’ll give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, “I’m a Democrat,” “I’m a Republican.” Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, “You can have this or you can have every single thing you do in the world be Google-able forever,” I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are “no longer relevant,” “inadequate information,” and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision at the insistence of the government, it must keep it secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of like EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google “right to be forgotten horror stories.” That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European “right to be forgotten.” Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the “right to be forgotten” infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way in your view to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curveball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The US Should Adopt the 'Right to Be Forgotten' Online." Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website, and that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: the Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations, and the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about — how strict the criteria would be — and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —
    [laughter]
    Eric Posner: — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion — they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So, to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, where, as to a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question — I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity; you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then — and I think the Supreme Court now — found, and would find, that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion, and the right to ask what you have about me, in Europe applies to the state and also to corporations — but of course, within reasonable national security limitations and so on. You know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir, in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying it’s another avenue of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir, in the back, in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled is that people develop over time. You know, the thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so, you know, my view on this is, still, that censorship — which I don’t mean to be kind of a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, in Europe as in America, to go to court against the decision, as it could against a decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google — and that’s it, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth estate. It is good that the press is after Google and says: be more transparent, tell us what you’re doing, tell us what the criteria are. That is good, and we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. What’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money by searching well — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is "The US Should Adopt the 'Right to Be Forgotten' Online."

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three is closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated — unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. That is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But in each case it is justified by overcoming a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members, and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children, that you wouldn’t want them to hear, is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor: this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web — let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff comes down." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like: when Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know? Paul has been working, I think, to make sure that happens not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions, and in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much — just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website — that’s IQ2US.org — and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second; they went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]
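The winner-selection rule Donvan describes — the side whose share of the audience vote moves the most between the first and second polls wins, regardless of absolute totals — can be sketched as a quick check of the figures read out above. This is only an illustrative sketch; the function and variable names are mine, not anything used by Intelligence Squared:

```python
# Winner-selection rule as described in the transcript: the side whose
# vote share moved the most between the two audience polls wins.
def debate_winner(first: dict, second: dict) -> str:
    """Return 'for' or 'against', whichever side gained the most points."""
    deltas = {side: second[side] - first[side] for side in ("for", "against")}
    return max(deltas, key=deltas.get)

# Figures as read out (the undecided share is not part of the scoring rule):
first_vote = {"for": 36, "against": 26}
second_vote = {"for": 35, "against": 56}
print(debate_winner(first_vote, second_vote))  # prints "against" (-1 vs +30)
```

Note that the rule rewards movement, not majority: a side could win the debate while still holding the smaller share of the final vote.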

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "right to be forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and children when they become 18 — to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is: the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: the right to be forgotten. We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about how, as a society, we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or, for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was to respond to their calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity or individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too, the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) Whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay — for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power will be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point — in the Fourth Amendment search-and-seizure context, which says that as to a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook — how do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what you have about me in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no — we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google — that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law — to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that — but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second; they went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. From me, John Donvan, and Intelligence Squared U.S., thank you. We’ll see you next time. [applause]


  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you’re saying — something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten": the words, the natural words, don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold. For example, in a political campaign, those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." A last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, you can ask for information about your credit which is older than seven years to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

Suppressing true information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles — the ones that I found were in the English-language press in the UK — outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. Those of you who are old enough will remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name into Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too; I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera — that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can — back to you guys — I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant," in the view of the complainant, information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, that is. But they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down — I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog — like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "right to be forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "right to be forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post — namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity or individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — there is the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook — you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and you are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make a decision on a daily basis: do we publish something or not? And if they get a request, for example, to correct or to take down some information from a website, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish premier league referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about — they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story — that won’t be linked under the other guy’s name — push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the —
    [laughter]
    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me, in Europe, applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe — than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google — and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, so that the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine, in a world in which we have search engines competing with each other.
    John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, directorate general for justice and consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy: free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation. Which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified only where it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already?
    Male Speaker: It was erased.
    [laughter]
    Male Speaker: Does anybody remember?
    [applause]
    John Donvan: All right. Well, for the sake of people who are listening —
    [laughter]
    Play along, please.
    [laughter]
    Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points.
    [applause]
    John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time.
    [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared U.S. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz.
    [applause]
    John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC.
    [laughter]
    John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what?
    Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google.
    John Donvan: And that’s what you’re going to be arguing tonight. And tell us, who is your partner in that argument?
    Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School.
    John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause]
    Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now?
    Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough.
    John Donvan: Especially after tonight.
    Eric Posner: Especially after tonight.
    John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."
    [applause]
    John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin.
    [applause]
    John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language?
    Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty.
    John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew?
    Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain.
    John Donvan: Ladies and gentlemen, Jonathan Zittrain.
    [applause]
    John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but —
    [laughter]
    Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes.
    John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz.
    [applause]
    Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or, for that matter, if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, more than any other source of information about this person. And our law requires that, in the same way that our people can ask data brokers, the bank, you know — they can ask the school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and children, when they become 18 — to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin.
    [applause]
    Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," are powerful. And they are understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty: the ability to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. It works by suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information: not libel, not defamation, not hate speech. True information.

  • 00:15:59

    It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut (W-U-U-T), which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared U.S. debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause]
    Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, and she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her: what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay, that’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days divorce would be public — it’s public information — but it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause]
    John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy —
    [applause]
    John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain.
    Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too — I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking. I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution, because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist dictators or communist dictators, don’t want anyone to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: — the original — John Donvan: Let’s let — Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog. Go on down, enjoy the stacks — to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough — that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. So you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here" — like, only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Googleable forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information — and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law — that’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make, on a daily basis, a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the U.S. is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. Which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations, and the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.
    [laughter]
    Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law; but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me?" applies in Europe to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.
    John Donvan: Can you rephrase what your question specifically is in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here — and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. Google makes money from search — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion: the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified, and it overcomes a very high bar, in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. That perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the pro folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure transparency comes not on any specific case, only on the overall criteria. That seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate — it was relatively nuanced — and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already?
    Male Speaker:
    It was erased.
    [laughter]
    Male Speaker:
    Does anybody remember?
    [applause]
    John Donvan:
    All right. Well, for the sake of people who are listening —
    [laughter]
    Play along, please.
    [laughter]
    Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and the first vote: 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result was, for the team arguing for the motion, their second vote was 35 percent. That means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points.
    [applause]
    John Donvan:
    The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time.
    [applause]

  • 01:31:40

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz.
    [applause]
    John Donvan:
    Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC.
    [laughter]
    John Donvan:
    And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what?
    Paul Nemitz:
    I meant that there are limits to snooping, collecting and making my private life public on Google.
    John Donvan:
    And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument.
    Paul Nemitz:
    The eminent professor Eric Posner from Chicago University Law School.
    John Donvan:
    Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause]
    Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now?
    Eric Posner:
    No, they haven’t, but I’m hoping they’ll change their minds soon enough.
    John Donvan:
    Especially after tonight.
    Eric Posner:
    Especially after tonight.
    John Donvan:
    Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."
    [applause]
    John Donvan:
    And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin.
    [applause]
    John Donvan:
    Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language?
    Andrew McLaughlin:
    [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty.
    John Donvan:
    Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew?
    Andrew McLaughlin:
    The equally eminent Professor Jonathan Zittrain.
    John Donvan:
    Ladies and gentlemen, Jonathan Zittrain.
    [applause]
    John Donvan:
    The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a co-founder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain:
    Well, we could end early and just hit the bar, but —
    [laughter]
    Jonathan Zittrain:
    — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes.
    John Donvan:
    All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz.
    [applause]
    Paul Nemitz:
    Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much.
    John Donvan:
    Thank you, Paul Nemitz.
    [applause]
    John Donvan:
    And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin.
    [applause]
    Andrew McLaughlin:
    Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe in not censorship, but more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion.
    John Donvan:
    Thank you, Andrew McLaughlin.
    [applause]
    John Donvan:
    And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause]
    Eric Posner:
    I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause]
    John Donvan:
    Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy —
    [applause]
    John Donvan:
    — School of Government. Ladies and gentlemen, Jonathan Zittrain.
    Jonathan Zittrain:
    Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when a request is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

[laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to eliminate the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

I’m like, "Bing, how many are you getting?" They’re like, "We can’t tell you." It’s like, "You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?" Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

This issue of privacy. But they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear — when we say "deletion of information published by others," that is deletion, literally, of information published by Google, which is the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving is that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is raise the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data?

  • 00:44:58

I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make — if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

But that’s a false dichotomy. I think there is plenty in the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it (and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too) to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, which at the insistence of the government it must keep secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for a critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

My point is that this right is, in my judgment, a typical kind of EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, from the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay. I also want to point out, your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for — you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Do they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, which won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be tailored very specifically to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; the case just happened to be about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with — at least in the U.S., going to your constitutional point — the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "For" side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I’m saying it’s another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money by searching well — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech — the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, on the one hand, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified only when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children; or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it’s not about the specific cases, only about the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page, marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that role, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment, first of all, you have to understand this just doesn’t fall from the sky like this, but is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, in the beginning, very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of the law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America: a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said, the way he framed it, that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "Who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy who is arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, she files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog, like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger, if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where it’s more chilling for some entity or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision at the insistence of the government, and it must keep it secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers, they have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examp les, the guy who did the professional [unintelligible] the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, I need to just in the interest of time I need to interrupt you for that, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was justmaking. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy and Paul is actually saying it’s the ordinary guy who’s getting the take downs and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here, where a Scottish Premier League referee named "Dougie" McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm, or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed from, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it happens that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set them up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the Fourth Amendment search and seizure context, which says that as for a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would — you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is: he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe — than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying it’s another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC. Male Speaker: But Paul — Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google — that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, to balance the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. Google makes money by searching well: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right of informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening is shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that disclosure is not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page, marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about. You just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry, the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you if you would be willing to make a donation to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates and I want to get to the results now, because it is all in. We have the final results. Once again the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online, and in the first vote 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percent. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: The Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first what do you have about me and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold. For example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "The right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on, we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again, this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out comes these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is: are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were: have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were: should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

[laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had — apologies, Andrew, to your former employer — against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

Let it do so in camera — that is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is a wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the U.S. should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog — like, go on down, enjoy the stacks — to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post — namely that a concert pianist was asking for a takedown of the critique, the bad critique, of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening, tomorrow you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative, where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make — if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out — because why not, if that’s what you’re wanting to do, Facebook — you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something — a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment — and that’s normal in the law — you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere — like, for example, the concert artist — the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the U.S. is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles — the Guardian, the Daily Mail — that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance — or, if you want to say it the other way around, free speech and privacy — and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about — they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay. I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain:
    What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan:
    Well —

    Jonathan Zittrain:
    So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.

    John Donvan:
    Jonathan, that is what’s called "hitting a curve ball out of the park."

    Jonathan Zittrain:
    Thank you.

    John Donvan:
    Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker:
    Hi.

    John Donvan:
    If you can stand up and grab the mic.

    Male Speaker:
    Hi. I’m Adam. This law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan:
    Paul Nemitz, in Europe.

    Paul Nemitz:
    This judgment is specific to Google as a search engine. And it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan:
    Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin:
    I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz:
    All search engines.

    Andrew McLaughlin:
    Paul frames this as a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my Social Security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea.

    Paul Nemitz:
    It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan:
    Sir. Can you tell us your name, please?

    Male Speaker:
    Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, imagine this happened in the US: Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan:
    Good question.

    Male Speaker:
    And then if, you know, Google wanted to deny a request or didn’t want to deal with the process, you could have some special advocate appointed to —

    John Donvan:
    Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?

    Jonathan Zittrain:
    It is curious that you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And — I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit the wrong kind of peg for the wrong kind of hole.

    Andrew McLaughlin:
    I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.

    John Donvan:
    Eric, did you want to respond to that?

    Eric Posner:
    It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I are very close together, meaning that he’s on our side.

    [laughter]

    Eric Posner:
    For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria. Now, often it’s very difficult to set these up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.

    John Donvan:
    [unintelligible] take right down in the front in the third row, please. And the mic’s coming down on your right-hand side. And if you could stand up —

    Female Speaker:
    [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan:
    Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "right to be forgotten."

    Female Speaker:
    Right. So, to me, it’s based on an inherent right to privacy. But how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan:
    Eric Posner.

    Eric Posner:
    Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan:
    I’m going to — unless you really need to respond to that question, I’m going to move on.

    Female Speaker:
    For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan:
    Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.

    Jonathan Zittrain:
    I —

    John Donvan:
    Thank you for the question.

  • 01:06:56

    Jonathan Zittrain:
    I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan:
    Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan:
    When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan:
    Yeah, thanks.

    Male Speaker:
    This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.

    John Donvan:
    Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan:
    — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz:
    Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan:
    So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz:
    Well, the answer is, he described a case where he would, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan:
    Sir in the back near the wall. Thanks.

    Male Speaker:
    Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that.

    John Donvan:
    Can you rephrase what your question specifically is in terms of the motion here?

    Male Speaker:
    In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps an alarming signal about imposing something like this?

    John Donvan:
    In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker:
    Not exactly. I’m saying another way of press suppression —

    John Donvan:
    Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker:
    Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain:
    I wouldn’t want Google to be uniquely privileged to answer that question.

    John Donvan:
    Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner:
    I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled is that people develop over time. The one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan:
    From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin:
    Well, I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. The internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody’s life. And so my view on this is, still, censorship — which I don’t mean to be a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to learn how not to judge people in this incredibly harsh way that we seem inclined to do.

    John Donvan:
    Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no — we’re going to delete or not — what are the checks on that from outside? How does that work?

    Paul Nemitz:
    Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker:
    But Paul —

    Paul Nemitz:
    This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good, and we totally support it. I mean, of course, you know —

    Male Speaker:
    Yeah, Paul, I have to — [unintelligible].

    John Donvan:
    [unintelligible].

    Andrew McLaughlin:
    I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to disappear. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.

    John Donvan:
    But it does live on. It’s not removed from the internet. It’s just that you can’t find it through the criminal’s name.

    Andrew McLaughlin:
    They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner:
    There are interests on both sides.

    John Donvan:
    Eric Posner.

    Eric Posner:
    So, that’s the classic role for the law, to balance the interests on both sides —

    Andrew McLaughlin:
    How are they vindicated?

    Eric Posner:
    And they are vindicated. What’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money by searching well — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain:
    — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner:
    No, but they don’t want to set a precedent where they —

    Jonathan Zittrain:
    Google will pay no penalty for over-deleting.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    And no one will even know.

    Eric Posner:
    They will.

    Jonathan Zittrain:
    Who will even know?

    Eric Posner:
    They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan:
    And that concludes round two of this Intelligence Squared US debate, where our motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:17:58

    [applause]

    John Donvan:
    Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz:
    Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan:
    Thank you, Paul Nemitz.

    [applause]

    John Donvan:
    And that’s our motion, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin:
    You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each of those is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan:
    Thank you, Andrew McLaughlin.

    [applause]

    John Donvan:
    And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.

    Eric Posner:
    This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan:
    Thank you, Eric Posner.

    [applause]

    John Donvan:
    The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain:
    I can’t believe that in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening kept shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure it is not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn’t offer solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition —

    John Donvan:
    Jonathan Zittrain, I’m sorry —

    Jonathan Zittrain:
    — can I end with one thing —

    John Donvan:
    — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. We’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan:
    And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain:
    Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.

    [applause]

    John Donvan:
    Well, I’m sorry — the clock is my master and has to be your master, and the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote. The U.S. Should Adopt the "Right to be Forgotten" Online. In the first vote, 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

[applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete this data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them, either to show what he has done or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has done malpractice or if you are a politician. There are limits to this, which are limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if already the government always knows everything about you? So, we need privacy in a free society as we need free speech, and therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this, but it is application of existing law which we have in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

So, the court said you cannot ask the newspaper to take down the information because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents to say, and children when they become 18, to wipe out everything until they were 18. In the Fair Credit Reporting Act, about your credit, information which is older than seven years, you can ask to be deleted.

  • 00:13:02

So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former US deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet that have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

True information. The "right to be forgotten" intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And — contrary to Paul’s claim that it’s working kind of well — you can Google, while you can, any number of articles. The ones that I found were in the English language press in the UK outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts and thereby try to control their own futures has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis distinguished service professor of law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

[applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed, she has — her kids — her hands full with her kids.

  • 00:22:01

She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made a proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room, that — "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

Now, the employer might understand that this is not important, or he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

[applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor, but none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests. It’s like time to make the donuts. You’re like, yes, no, no. And Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them. But once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion, even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship": that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist dictators or communist dictators, don’t want you to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear, as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: The original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening, tomorrow you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty, like the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable: that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes. And there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law — that’s what happens when cases are heard by courts: in public session, decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers, they have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights — for example, the Von Hannover case — very clear criteria. What are they? A) Whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward — and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over. I’d really like you guys to give it its fullest consideration, there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very specifically tailored to Google. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches [spelled phonetically] instantiation of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific — it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US, and Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power will be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, that says, you know, with the Fourth Amendment search and seizure context, that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook, how do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if in the case of the Telegraph they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as delete — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation. I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on, you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer for his private purposes design his own web engine, so this constellation would mean that you are not a controller, you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side — Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people, so perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal to imposing something like this? John Donvan: In a sense are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the EU commission, directorate general for justice and consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is; it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now, this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records like legal documents and court materials, that options for contextualization to include possibly even somebody like Google deciding, and when it’s a search on a name maybe the first result shouldn’t be those 10 terrible whatever comes out of the roulette wheel links but a curated page marked as such and then many won’t go to the second page, reputation systems and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this was a little bit of a tough debate in terms of it being relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business Law and Public Policy at Columbia, as well as with the National Constitution Center; we’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on The U.S. Should Adopt the "Right to be Forgotten" Online: in the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion, their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]
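As an aside for readers following the scoring, the rule Donvan describes can be sketched in a few lines of Python. This is our illustration, not part of the broadcast: the side whose share of the audience vote gains the most percentage points between the two polls wins. The first-vote figures are the ones announced; the second-vote undecided share is inferred from the totals, since it was not read out.

```python
# Sketch of the Intelligence Squared scoring rule: the side with the
# largest percentage-point gain between the pre- and post-debate votes wins.

def vote_swings(before, after):
    """Percentage-point change for each side present in both polls."""
    return {side: after[side] - before[side] for side in after if side in before}

def winner(before, after):
    """Side with the largest positive swing between the two votes."""
    swings = vote_swings(before, after)
    return max(swings, key=swings.get)

# Announced figures; the post-debate undecided share (9) is inferred
# from the fact that the three shares must sum to 100.
before = {"for": 36, "against": 26, "undecided": 38}
after = {"for": 35, "against": 56, "undecided": 9}

print(vote_swings(before, after))  # "against" gains 30 points; "for" loses 1
print(winner(before, after))       # against
```

On these numbers, the 26-to-56 move for the side arguing against the motion is a 30-point swing, which comfortably beats the other side's one-point loss.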

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from Chicago University Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written, most recently, the book "The Twilight of Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, the school, after a certain time, for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around. But Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and for children when they become 18, to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. This right intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is a — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe, those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, or colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten and I want to explain in my opening statement why that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have I assume some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment when it is granted Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say in secret so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship" — that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that, whether fascist dictators or communist dictators, they didn’t want anybody to have privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say deletion of information published by others, that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state with some trepidation to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down, it’s like saying the book can stay in the library. We just have to set fire to the card catalog, like go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. What chilling effect is bigger, if we are able to know everything about you and even in a way which you wouldn’t even think of or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it and I’m, again, hoping it will be more than Google who does this, please Bing, do it too, please people we haven’t heard of, do it too. The idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable — that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are no longer relevant, inadequate information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten when you have a private company making the decision, which at the insistence of the government it must keep secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion and, for example, you know, newspapers have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has himself or herself made the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of like EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in this story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Faulkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way in your view to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that is now not the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed, or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google specific that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms — to temper, maybe, the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute that said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about — how strict the criteria would be — and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search and seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance of who’s wanting to peek under the envelope that all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know — in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law. But of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us, pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life, I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not — what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility — like, in Europe and in America, against a decision of the FTC — to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google — that, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation — which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you — that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won’t go to the second page; at reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry, the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to Be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to Be Forgotten' Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]
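
The "largest swing between the two votes wins" rule John Donvan describes is simple arithmetic, and the percentages read out on stage can be checked with a short sketch. The helper name and data layout below are illustrative assumptions, not anything used by Intelligence Squared; the second vote's undecided share is implied as 100 - 35 - 56 = 9.

```python
# Sketch of the "largest swing between votes wins" rule described above.
# Helper name and data layout are illustrative, not any real system.

def biggest_swing(first, second):
    """Return (side, delta) for the side whose vote share rose the most."""
    deltas = {side: second[side] - first[side] for side in first}
    side = max(deltas, key=deltas.get)
    return side, deltas[side]

# Percentages as read out on stage.
first_vote = {"for": 36, "against": 26, "undecided": 38}
second_vote = {"for": 35, "against": 56, "undecided": 9}

print(biggest_swing(first_vote, second_vote))  # ('against', 30)
```

On these numbers, the side arguing against the motion gains 30 percentage points, by far the largest swing, which is why it is declared the winner.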


  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve most recently written the book "The Twilight of Human Rights Law." And back in the U.S., when Europe got its "Right to Be Forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said, actually, that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew. Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s — "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you’re saying — something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side, but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion in this debate is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view of this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words — they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion — of course, within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment — first of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank — you know, they can ask the school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication of this fact in a newspaper, in order to make the auctioning of the house, to cover the social contributions, more attractive for the Spanish state — because if it’s in the newspaper, there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around — but Google is subject to the same law of control of information, of self-determination of individuals, which allows them to ask, first, "What do you have about me?" and second, "Please delete." Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go on to the courts. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age — in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you, with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files are sold — for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to it, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California there is now a law which gives parents — and children, when they become 18 — the possibility to wipe out everything from before they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America: a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion — the emotional rationale that lies behind the "right to be forgotten" — are powerful. And understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake you made that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas — all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: the "right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty — being able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information: not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory, because it suppresses the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was — that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so, for that reason, we must treat the "right to be forgotten" as what it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about a person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond, as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so, finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, or they may not. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist; she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So, to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay, that’s completely changed. Technology has changed. So the privacy laws at the time could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name into Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s would have gossiped, where today they would write about it on Facebook. Maybe she would have written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple: in the old days, divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google, and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves. And it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason, you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too — I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library, we just have to set fire to the card catalog — like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative — is it more chilling for some entity to have that power or the individual to have that power? Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate information," and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, the cases we hear about are happenstance — how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) Whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made, himself or herself, the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, stops the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google — and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I’d really like you guys to give it its fullest consideration — there’s no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what’s called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I’m Adam. This law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?
    John Donvan: Paul Nemitz, in Europe.
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just like a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea.
    Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things that we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: the Congress passed a statute that said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations, and the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It’s not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws — statutes in most states — allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about — how strict the criteria would be — and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side.
    [laughter]
    Eric Posner: You know, for example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think that there are administrative criteria. Now, often it’s very difficult to set these up in advance, precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic’s coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you’re talking about — the side arguing for the motion, they’re arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that you don’t have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on.
    Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity — you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU, it sounds like would give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us.
    John Donvan: Okay. So you’re asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA: the data protection law applies without difference to the state and to private search engines. So, the right to deletion, and the right to ask what you have about me, in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps Eric Posner would like to address that.
    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is the example of European libel law perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I’m saying another way of press suppression —
    John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you’re 50, now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they’ve tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we’re going to delete or not: what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or to go to an independent data protection authority — which is something like an independent agency, like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It’s not a discretionary decision of Google where, you know, there’s the blue sky and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing, tell us what the criteria are. That is good, and we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered.
    John Donvan: But it does — it does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach’s — sorry, referee’s — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that’s the classic role for the law, to weigh the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated. And what’s on the side of disclosure is something far more powerful than the government: it’s the profit motive. Google makes money — the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don’t want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will — they will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion, the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right of informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you, as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote: the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that’s our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But that censorship is justified when it overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it’s you. Perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you? Is the "he" me? Or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can’t believe that, in the last slot, I’m still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    Jonathan Zittrain: This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening keeps shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff stays up and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure that’s not so on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page, marked as such — and then many won’t go to the second page; reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I’m sorry —
    Jonathan Zittrain: — can I end with one thing —
    John Donvan: — your time is up. No [spelled phonetically]. Thank you very much — just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you’re undecided, push number three. Then we’ll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in the sense that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was, or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by it. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so.
    [applause]
    John Donvan: Well, I’m sorry the clock is my master and has to be your master — the people voted before hearing that — but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business Law and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be Abolish the Death Penalty. We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was The U.S. Should Adopt the "Right to be Forgotten" Online. We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: The U.S. Should Adopt the "Right to be Forgotten" Online.

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on The U.S. Should Adopt the "Right to be Forgotten" Online: 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion got 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve most recently written the book "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "right to be forgotten" a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "right to be forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "right to be forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "right to be forgotten": the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said that in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, “What do you have?” and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is the application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, “This doesn’t apply to us. We are not what we call in Europe a data controller.” Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, the school, after a certain time, for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. Just last week I met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives parents, and children when they become 18, the possibility to wipe out everything from before they were 18. In the Fair Credit Reporting Act, information about your credit which is older than seven years you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, “Yes, this is what we need in America, a right to obscurity.” Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinning of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following, and this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    Suppressing true information intrudes into our collective memory by suppressing the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said, the way he framed it, that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google any number of articles outlining all of the pieces that have been suppressed; the ones that I found were in the English-language press in the UK. Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland and Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, you’ll remember; for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things about each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves; not necessarily political information, it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "right to be forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, whereas today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to use my opening statement to explain that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource. There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out — here’s what was deleted today — it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middleman so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there are so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is wiki? What if it’s a nonprofit? In fact, what if it’s a page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the "for" side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information deemed, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans: whether fascist dictators or communist dictators, they didn’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, when we say deletion of information published by others, that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion, as to them, of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric, continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go to town, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening, tomorrow you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff before you have a Valdez-like spill, or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and over again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts in public session: decisions are reported and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case: very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried — I’ve heard here some examples: the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, in the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though, which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension, and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question, like, directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know — I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: I haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it — John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over. I’d really like you guys to give it its fullest consideration, there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific; it’s just that the case was about Google.

  • 01:00:57

But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper in your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute saying something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power will be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

I don’t think there’s any way that’s constitutional, for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, that says, you know, in the Fourth Amendment search and seizure context, that a reasonable expectation of privacy, you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

Jonathan Zittrain: — I haven’t bothered to Bing it yet, but there’s either — it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the for side. I’m a developer.

  • 01:08:00

It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU, it sounds like, give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking, if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say: does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reason, national security limitations and so on, you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there’s nothing new here, everything’s legal — but as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship, which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete, is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

But then Google has the possibility, as in Europe and in America, against the decision of the FTC, to go to court. Male Speaker: But Paul – Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money: the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they – Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

[applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to make his first — and here to speak first in the closing round, Paul Nemitz. He is director at the EU Commission, Directorate-General for Justice and Consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion the US should adopt a "right to be forgotten," not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction. The empowerment of an individual on one side of a conversation, which is what speech on the internet is: it’s people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

Now, Andrew quoted Orwell, I believe it was: "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you, for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria; that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, that maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; at reputation systems; and, of course, competition – John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion is — has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this was a little bit of a tough debate, in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about: you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

[applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that, but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, 38 percent were undecided. In the second vote, the result for the team arguing for the motion was 35 percent; that means they lost one percentage point. The team arguing against the motion: their first vote was 26 percent, their second was 56 percent. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]


  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book most recently "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea because it — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side, that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank, you know — they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain, he had not paid some social contributions and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to put things — take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station; all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions, you know, in the beginning were very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: You already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything from before they were 18. In the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be — to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European Court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that discipline — disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from like 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy; he’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: A single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world or even outside this area hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves, not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal, even someone like Jonathan, smart guy, or me, not so smart guy, we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s and before then was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control, they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times and, if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google-searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascist dictators or communist dictators, they didn’t want anyone to have any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must not say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library. We just have to set fire to the card catalog, like go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America: yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I’ll give you one example of what you’re not thinking of today. You’re not a very political person maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — the good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff; before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this; please, Bing, do it too; please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion, and, for example, you know, newspapers have to make on a daily basis a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) Whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for the critique to be taken down — all this, under the law, should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results and — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were not now the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don’t know, calling a few friends at Google and saying, hey, just so you know, I shot a request over. I’d really like you guys to give it its fullest consideration, there’s no accountability. And Google wants to have more accountability and is being told it cannot because it cannot disclose its internal processes. John Donvan: Well — Jonathan Zittrain: So, I worry about Google’s market concentration. I feel I’ve been making that argument today. And, oddly, this solution cements that. John Donvan: Jonathan, that is what’s called "Hitting a curve ball out of the park." Jonathan Zittrain: Thank you. John Donvan: Well done. I want to remind you we’re in the question and answer section of this Intelligence Squared US debate. I’m John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let’s go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks. Male Speaker: Hi. John Donvan: If you can stand up and grab the mic. Male Speaker: Hi. I’m Adam. This — this law seems to be very Google-specifically tailored. And I’m wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn’t like, that rocketed up to the front page, well, digg.com has an extensive search feature. Would that qualify as somebody who I could then demand that my data be removed or not? John Donvan: Paul Nemitz, in Europe. Paul Nemitz: This judgment is specific to Google as a search engine. Now, when — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech, no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that? Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it’s the view of the European commission that all search functions, all search features, all search engines are subject to the "right to be forgotten." Paul Nemitz: All search engines. Andrew McLaughlin: Paul frames this as just like a very, "Nothing to see here," kind of like incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we’re talking about here is this very weird Google-specific, cabined-to-particular-searches instantiation [spelled phonetically] of a right to force other people to delete information, which is why I think it’s such a bad idea. Paul Nemitz: It’s not Google-specific that the case was about Google.

  • 01:00:57

    But, of course, there’s something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper in your privacy and profiles you much more than only the one website on which you have your stuff. John Donvan: Sir. Can you tell us your name, please? Male Speaker: Sure. It’s Alameen Sumar [spelled phonetically]. So my question for the "No" side is this: Are there ways — mechanisms to temper maybe things that we see as problematic about the "right to be forgotten" in ways that makes the case more compelling? So, for example, you know, imagine if this happened in the US: the Congress passed a statute and said something like, if something’s irrelevant or inadequate, it has to be removed. But then the agency to which that power will be delegated would promulgate some regulations. And the regulations would be a bit more specific. John Donvan: Good question. Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn’t want to deal with the process, you could have some special advocate appointed to — John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable? Jonathan Zittrain: It is curious you’re asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I’d welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I’ll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation style model as a solution for the kinds of reputational problems we’re talking about. I think it’s a little bit of the wrong kind of peg in the wrong kind of hole. Andrew McLaughlin: I’m not with Jonathan on this one. I can’t think of any administrative construction — any administrative structure that would make this acceptable because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don’t think there’s any way that’s constitutional for one thing, but I also think it’s just fundamentally a terrible, terrible idea. John Donvan: Eric, did you want to respond to that? Eric Posner: It’s not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people’s criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he’s on our side. For the — [laughter] Eric Posner: — you make a — you know, for example, I would completely rule out public figures from using the "Right to Be Forgotten," only private individuals, and only private individuals who didn’t commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it’s very difficult to set up in advance precisely because we’re in a new world with new technology. And so you have to build it up through decisions. That’s how our legal system works. And I expect that’s how it would work if the "Right to Be Forgotten" were implemented. John Donvan: [unintelligible] take right down the front in the third row, please. And the mic’s coming down your right hand side. And if you could stand up — Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win — John Donvan: Just so that the radio audience knows who you’re talking about, the side arguing for the motion — they’re arguing for the "Right to Be Forgotten." Female Speaker: Right, so to me it’s based on an inherent right to privacy, but how do you square that with, at least in the U.S., going to your constitutional point, the idea that, you know, in the Fourth Amendment search and seizure context, a reasonable expectation of privacy — you don’t have one the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook. How do you square those?

  • 01:05:10

    John Donvan: Eric Posner. Eric Posner: Well, those are different contexts. That’s in — the Fourth Amendment is in the context of criminal investigations. And, yeah, there’s a Supreme Court opinion which says what you said but I suspect will be revised over time. These constitutional norms are not fixed, they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it’s going to reconsider many of these constitutional norms in light of new technology. There’s a case involving GPS, for example, where the courts understand that their old norms don’t work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they’ll come up with new norms that probably will respect both interests, the best that can be done. John Donvan: I’m going to — unless you really need to respond to that question, I’m going to move on. Female Speaker: For Andrew and Jonathan, I’m thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there’s less effort and more anonymity to just click on something. What if in the case of the Telegraph they had a list of suppressed articles but to access those suppressed articles you had to log in, in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles, because suppressed is not the same as deleted — John Donvan: Okay, I think we see where you’re going with that. Let’s take a — let Jonathan answer that. Jonathan Zittrain: I — John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: — I haven’t bothered to Bing it yet, but it’s either Rowan v. Postmaster General or Lamont v. Postmaster General, a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature —" I’m not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional because it’s exactly the kind of surveillance on who’s wanting to peek under the envelope that is what all sides appear to be concerned about. John Donvan: Another question? Right in the center there. Adamant waving actually does work with me. [laughter] John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again — [talking simultaneously] John Donvan: Yeah, thanks. Male Speaker: This question is for the "for" side. I’m a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which would, you know, in the EU it sounds like it’d give me an advantage over other people in terms of the information that I have access to. I’m curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines such as the NSA’s database of information that they have about us. John Donvan: Okay. So you’re asking if you were smart enough, which sounds like you probably are — [laughter] John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn’t address it, but let’s say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric. Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use you would not, in Europe, fall under our data protection law, but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask what do you have about me in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart. John Donvan: So, how far are you getting in asking the NSA to delete information? [laughter] Paul Nemitz: Well, the answer is he described a case where he would, you know, as a developer for his private purposes design his own web engine, so this constellation would mean that you are not a controller. You would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you’re doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks. Male Speaker: Hi. My name is Jeff Roberts. For the yes side: Paul, you wanted to reassure us by pointing to existing EU press law that, you know, there’s nothing new here, everything’s legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people, so perhaps Eric Posner you’d like to address that. John Donvan: Can you rephrase what your question specifically is in terms of the motion here? Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this? John Donvan: In a sense are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting — Male Speaker: Not exactly. I’m saying another way of press suppression — John Donvan: Oh, in that case I’m going to respectfully pass on the question, because I want to keep it specifically on the right that we’re talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though. Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all? [applause] Jonathan Zittrain: I wouldn’t want Google to be uniquely privileged to answer that question. John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn’t define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you’re 50 now, the person who Googles you may well know nothing about what you’ve done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it’s human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old. John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They’re only looking at what comes up on the first page. Andrew McLaughlin. Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do or is associated with you trails you your entire life I think is heartbreaking. And it’s something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody’s life. And so, you know, my view on this is, still, censorship — which I don’t mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I’d like to see Google evolve, to add a more direct right to respond. And by the way, footnote, they’ve tried.

  • 01:13:56

    They created something called Google Plus where anybody could create a profile that shows up very high on your results. You can populate that with any data you want to about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do. John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions. In giving the deciding power, the adjudicating power to Google, a private corporation, to make the decision yes or no, we’re going to delete or not, what are the checks on that from outside? How does that work? Paul Nemitz: Well, Google as a private corporation is subject to the law as anybody else. And, of course, its decisions can be checked. So, if you ask that something be taken down, and they don’t take it down, then you have, in Europe, the possibility either to go to a court directly and then a judge will decide, or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, like in Europe and in America, against the decision of the FTC to go to court. Male Speaker: But Paul — Paul Nemitz: This is an area — this is an area which is subject to the law. It’s not a discretionary decision of Google, and that’s, you know, the blue sky, and beyond that there’s nothing. Google is here in the same position as any other enterprise. They operate in the realm of law and in Europe, like in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you’re doing. Tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know — Male Speaker: Yeah, Paul, I have to — [unintelligible]. John Donvan: [unintelligible]. Andrew McLaughlin: I think you’re evading the central objection of our side to what you’re saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I’ve got here, there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal’s conduct to be remembered. John Donvan: But it does — it does — it does live on. It does live on. It’s not removed from the internet. It’s just you can’t find it through the criminal’s name. Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice, maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee’s name to be associated with that.

  • 01:17:01

    They have an interest in that. Eric Posner: There are interests on both sides. John Donvan: Eric Posner. Eric Posner: So, that’s the classic role for the law, so that the interests on both sides — Andrew McLaughlin: How are they vindicated? Eric Posner: And they are vindicated, and what’s on the side of disclosure is something far more powerful than the government. It’s the profit motive. So Google makes money — the better its searches are, the more money it makes. This is why — [talking simultaneously] Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place — Eric Posner: No, but they don’t want to set a precedent where they — Jonathan Zittrain: Google will pay no penalty for over-deleting. Eric Posner: They will. Jonathan Zittrain: And no one will even know. Eric Posner: They will. Jonathan Zittrain: Who will even know? Eric Posner: They will — they will know because it will do worse than an alternative search engine in the world in which we have search engines competing with each other. John Donvan: And that concludes — that concludes round two of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause] John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is director at the EU commission, directorate general for justice and consumers. Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means for Google searches that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy, free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and on the other hand the power of Google. Is Google free to do anything in terms of plotting you, predicting you and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That’s what you’re voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do possible, then tonight you should vote the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And that’s our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks. Andrew McLaughlin: You should vote against the motion. Paul’s construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation, which is what speech on the internet is — people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It’s that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It’s the wrong solution to a very real problem, particularly where there are other paths, more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes, to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised, simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But it’s justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met their burden to say that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet politburo through the 1930s and 1940s. It’s captured also in fiction in 1984. But you can see photographs where Stalin would fall out with one of his politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have them completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He’s a professor of law at the University of Chicago. Eric Posner: This debate is really not about politics. It’s not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It’s a right in the United States. It’s been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again, but this time that it’s you, that perhaps you’re going through hard times in your marriage if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it’s you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn’t want them to hear is repeated in the court documents. In the old days nothing much would come from this. Today, imagine it now, this is what people learn about you, this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, this is the first thing they’re going to hear about you, and they’re going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell, I believe it was, "He who controls the past controls the future." Orwell’s invocation was inevitable. But I want to ask you, though, who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much. John Donvan: Thank you, Eric Posner. [applause] John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is co-founder of The Berkman Center for Internet & Society. Jonathan Zittrain: I can’t believe in the last slot I’m still the first to be able to thank everybody, Paul, Eric, John, Andrew, and all of you for an unforgettable evening. [laughter] This has been wonderful. Sorry. We’ll strike that later. We’ve heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let’s hope we don’t get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening, it’s shifting. It’s a standard that’s — "Well, we can tweak that. We’ll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked, who is "we"? And in the proposal "we" is Google, not just being ordered to do something but being the decider reviewable only if the complainant is not happy with the result. That is basically built — wired into the proposal. And I didn’t hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure not on a specific case, only on the overall criteria, and that seems very, very dangerous to me. Now, John said we didn’t know solutions. And in 30 seconds, it’s true that we can’t unpack them very well. But I’d encourage people to look at solutions very specific to very sensitive systems of records like legal documents and court materials, at options for contextualization, to include possibly even somebody like Google deciding, when it’s a search on a name, maybe the first result shouldn’t be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such, and then many won’t go to the second page; reputation systems; and, of course, competition — John Donvan: Jonathan Zittrain, I’m sorry — Jonathan Zittrain: — can I end with one thing — John Donvan: — your time is up. No [spelled phonetically]. Thank you very much, just in fairness to everybody. Thank you, Jonathan. [laughter] [applause] And that concludes our closing statements.

  • 01:27:04

    And now it’s time to learn which side you feel has argued the best. We’re going to ask you again to go to the keypads at your seat and, again, look at this motion, "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two, and if you’re undecided push number three. Then we’ll have the results in just a moment. Before we do that I want to say that this was a little bit of a tough debate in terms of it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we’re talking about. You just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause] John Donvan: And for the side arguing against the motion, although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done. [talking simultaneously] Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can’t trust them to be so. [applause] John Donvan: Well, I’m sorry the clock is my master and has to be your master, but the people voted before hearing that. But thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time. [applause]

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this organization, this agency: the Fundamental Rights and Union Citizenship unit in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying to Google, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book, most recently, "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is: have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s — "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online" in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten," the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data: you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side: that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have the privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already always knows everything about you? So, we need privacy in a free society as we need free speech. And therefore, and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that in the same way that our people can ask data brokers, the bank, you know, they can ask in school after a certain time for things to be deleted, they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house, to cover the social contributions, better and more attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station. All this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which go to the court. Google has understood the European law of balance between privacy and free speech. The decision was, you know, in the beginning very contested; I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple Watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters, about every person.

  • 00:12:02

    And these files, they are sold, for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to the data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents, and children when they become 18, to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," are powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made, that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information: not libel, not defamation, not hate speech.

  • 00:15:59

    It intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said that this is — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague; it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is "inadequate, irrelevant or no longer relevant, or excessive." I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe: those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google — well, you can Google while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like an unflattering 2010 Washington Post review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-à-vis dictatorships. In a society with a steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed to not be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly about, as a society, how do we treat past mistakes and past errors? Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "Who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough, or for some of the younger people here, I’ll just have to tell you what it was like then, when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which are exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then, in 1990, that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it into a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, when maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues, anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to Be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times, and if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten, and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think, except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes that when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it, and when they do, it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed, because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many of these are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many, or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for information that is, in the words of the decision by the European Court of Justice, "no longer relevant" in the view of the complainant, to be taken out, and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is, the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up," amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans that dictators, whether fascist or communist, don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding of where we are coming from is different, and I would like, like Jonathan, to distinguish this notion from other things you might be thinking about. The notion here tonight is not that the US should adopt the EU "right to be forgotten" exactly as ruled by the European Court of Justice. John Donvan: Paul, let me step in, because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship, because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it — has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down. It’s like saying the book can stay in the library; we just have to set fire to the card catalog. Like, go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor, or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous, important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing, because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone about whom there is some public interest to know what this person has been doing and what it is about. But that’s not the case. Most of these requests — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here: whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who has stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America — yes, the person asked for it, but under our law it would never be right to take this down, because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. Which chilling effect is bigger: if we are able to know everything about you, and even in a way which you wouldn’t even think of, or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example of what you’re not thinking of today. You’re not a very political person, maybe, but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy: if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect: knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you, or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this, or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty of good work that Paul is doing in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook, you’d better talk with us, because that’s personally sensitive data. Great. Regulate it at the collection and at the source. But by the time you have a general web and stuff out on it, and are expecting anybody who dares to index it, and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous. So these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens, and what happens is, through repeated decision making, it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases, it becomes pretty clear what’s going on. Ordinary people, non-famous people, are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago. And you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law, and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported, and things evolve. Eric, I want to ask you directly: in this right to be forgotten, when you have a private company making the decision, and at the insistence of the government it must keep it secret, and the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have been taking place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion. And, for example, you know, newspapers have to make, on a daily basis, a decision: do we publish something or not? And if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? First, whenever the information is relevant, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere, like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy, and Paul is actually saying it’s the ordinary guy who’s getting the takedowns, and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of, like, EU elite construction, dressed up as a privacy right but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here — where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or, if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is: why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward?

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say, for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, saves the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul, because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in the story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way, in your view, to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that were now not the third hit on Google, I’ve got to contend with the fact that it is, if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and, in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration. There's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.
    John Donvan: Well —
    Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.
    John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."
    Jonathan Zittrain: Thank you.
    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: "The US should adopt the 'right to be forgotten' online." Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.
    Male Speaker: Hi.
    John Donvan: If you can stand up and grab the mic.
    Male Speaker: Hi. I'm Adam. This law seems to be very specifically tailored to Google. And I'm wondering what sort of criteria are applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody I could then demand that my data be removed from, or not?
    John Donvan: Paul Nemitz, in Europe?
    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?
    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."
    Paul Nemitz: All search engines.
    Andrew McLaughlin: Paul frames this as just, like, a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.
    Paul Nemitz: It's not Google specific just because the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual, when you put in a name, bringing up all the information existing about this individual, goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.
    John Donvan: Sir. Can you tell us your name, please?
    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms to temper the things that we see as problematic about the "right to be forgotten" in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute saying something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.
    John Donvan: Good question.
    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —
    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US, to the point where you would find it acceptable?
    Jonathan Zittrain: It is curious that you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.
    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.
    John Donvan: Eric, did you want to respond to that?
    Eric Posner: It's not unconstitutional. So, the right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased so that, you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together, meaning that he's on our side.
    [laughter]
    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.
    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —
    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —
    John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion; they're arguing for the "right to be forgotten."
    Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, which says that you don't have a reasonable expectation of privacy the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.
    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.
    John Donvan: I'm going to — unless you really need to respond to that question — I'm going to move on.
    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity; you just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles? Because suppressed is not the same as deleted —
    John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.
    Jonathan Zittrain: I —
    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it's exactly the kind of surveillance of who's wanting to peek under the envelope that all sides appear to be concerned about.
    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.
    [laughter]
    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —
    [talking simultaneously]
    John Donvan: Yeah, thanks.
    Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know, in the EU it sounds like would give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that — and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.
    John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are —
    [laughter]
    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say — does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.
    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law; but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe applies to the state and also to corporations. But of course, within reasonable national security limitations and so on — you know, there we are not so far apart.
    John Donvan: So, how far are you getting in asking the NSA to delete information?
    [laughter]
    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.
    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side: Paul, you wanted to reassure us by pointing to existing EU press law — that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you'd like to address that.
    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?
    Male Speaker: In terms of the motion: do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal about imposing something like this?
    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.S. — in the U.K. and in Europe — than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —
    Male Speaker: Not exactly. I'm saying another way of press suppression —
    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.
    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?
    [applause]
    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.
    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that, and the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious, committed a crime, for example, the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.
    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.
    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply. And I think a lot about that. You know, the internet has leapt way ahead of the human capacity to situate incidents like that in the broader context of somebody's life. And so, you know, my view on this is, still, censorship — which I don't mean to be kind of like a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem inclined to do.
    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google, a private corporation, to make the decision — yes or no, we're going to delete or not — what are the checks on that from outside? How does that work?
    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down and they don't take it down, then you have, in Europe, the possibility either to go to a court directly, and then a judge will decide, or you can go to an independent data protection authority — something like an independent agency like the FTC — and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.
    Male Speaker: But Paul —
    Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google where, you know, it's blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth estate. It is good that the press is after Google and says, be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —
    Male Speaker: Yeah, Paul, I have to — [unintelligible].
    John Donvan: [unintelligible].
    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another, higher authority to force the deletion. There are, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.
    John Donvan: But it does — it does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.
    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach's — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.
    Eric Posner: There are interests on both sides.
    John Donvan: Eric Posner.
    Eric Posner: So, that's the classic role for the law, so that the interests on both sides —
    Andrew McLaughlin: How are they vindicated?
    Eric Posner: And they are vindicated, and what's on the side of disclosure is something far more powerful than the government. It's the profit motive. Google makes money by searching well: the better its searches are, the more money it makes. This is why —
    [talking simultaneously]
    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —
    Eric Posner: No, but they don't want to set a precedent where they —
    Jonathan Zittrain: Google will pay no penalty for over-deleting.
    Eric Posner: They will.
    Jonathan Zittrain: And no one will even know.
    Eric Posner: They will.
    Jonathan Zittrain: Who will even know?
    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.
    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is "The US should adopt the 'right to be forgotten' online."

  • 01:17:58

    [applause]
    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.
    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you, by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, on the one hand, and the power of Google on the other. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, even in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.
    John Donvan: Thank you, Paul Nemitz.
    [applause]
    John Donvan: And that's our motion: the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.
    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction — the empowerment of an individual on one side of a conversation, which is what speech on the internet is: it's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.
    John Donvan: Thank you, Andrew McLaughlin.
    [applause]
    John Donvan: And the motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.
    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law and, to some extent, statutory law and other types of law. And I just want to ask you to imagine again, but this time that it's you — that perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children, or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today, imagine it now: this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor — this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.
    John Donvan: Thank you, Eric Posner.
    [applause]
    John Donvan: The motion is "The U.S. Should Adopt the Right to Be Forgotten Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.
    Jonathan Zittrain: I can't believe that, in the last slot, I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.
    [laughter]
    Jonathan Zittrain: This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening keeps shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure transparency applies not on a specific case, only on the overall criteria — that seems very, very dangerous to me. Now, John said we didn't offer solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding that, when it's a search on a name, maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links but a curated page marked as such — and then many won't go to the second page; at reputation systems; and, of course, competition —
    John Donvan: Jonathan Zittrain, I'm sorry —
    Jonathan Zittrain: Can I end with one thing —
    John Donvan: — your time is up. No. Thank you very much, just in fairness to everybody. Thank you, Jonathan.
    [laughter]
    [applause]
    And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the Right to Be Forgotten Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. We'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate, in that it was relatively nuanced, and I really want to congratulate the teams, in a couple of ways, for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about: you did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]
    John Donvan: And for the side arguing against the motion: although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.
    [talking simultaneously]
    Jonathan Zittrain: Here was the landing I was hoping to make: We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.
    [applause]
    John Donvan: Well, I'm sorry the clock is my master and has to be your master, but the people voted before hearing that. Thanks very much; I just wanted to give you your moment.

  • 01:28:54

    I also want to let — tell everybody who got up and asked a question, even the ones that I had to pass on, it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theater. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates, and I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the 'Right to be Forgotten' Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was response to some sort of mistake I made. Oh, it came up already? Male Speaker: It was erased. [laughter] Male Speaker: Does anybody remember? [applause] John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the 'Right to be Forgotten' Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the 'Right to be Forgotten' Online." In the first vote, 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second. They went up 30 percentage points. [applause] John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. From me, John Donvan, and Intelligence Squared U.S., thank you. We’ll see you next time. [applause]

  • 01:31:40

  • 00:00:00

    John Donvan: When you — when you go online and Google yourself — and we know that you do Google yourself. I do it. I did it once. I’ve done it. Do you always like what it is that you see? Because sometimes some of it can be unflattering. And is that how you want to be remembered, by those links to unflattering things five years from now or 50 years from now?

  • 00:00:01

    Or do you wish that you had the power to get some of those links removed? But if you had that power, what if somebody who had done something bad had that power? What if a doctor who had committed malpractice also wanted to get his links taken down? Would that be a good thing or a bad thing? Well, in Europe, they do have something of a system that allows individuals to do this. They call it the "right to be forgotten." And the question is, should we have that right here? Well, that sounds like the makings of a debate. So, let’s have it. Yes or no to this statement: "The U.S. Should Adopt the 'Right to Be Forgotten' Online," a debate from Intelligence Squared US. I’m John Donvan. We are at the Kaufman Music Center in New York City with four superbly qualified debaters, two against two, who will argue for and against this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." As always, our debate will go in three rounds, and then our live audience here in New York votes to choose the winner. And only one side wins. The motion again: "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:01:04

    Let’s meet the team arguing for the motion. Please, ladies and gentlemen, welcome Paul Nemitz. [applause] John Donvan: Paul, you are the director of this — of this organization, this agency: the Fundamental Rights and Union Citizenship in the Directorate General for Justice and Consumers of the European Commission, otherwise known as the FRUCDGJCEC. [laughter] John Donvan: And in that, you are responsible for the enforcement of privacy rights in Europe. And you have been quoted as saying, to Google, you have said, "You have no right to see me naked," by which you meant what? Paul Nemitz: I meant that there are limits to snooping, collecting, and making my private life public on Google. John Donvan: And that’s what you’re going to be arguing tonight. And tell us who is your partner in that argument. Paul Nemitz: The eminent professor Eric Posner from the University of Chicago Law School. John Donvan: Ladies and gentlemen, Eric Posner.

  • 00:01:57

    [applause] Yes, Eric, you are a law professor. You’ve written the book, most recently, "The Twilight of International Human Rights Law." And back in the U.S., when Europe passed its "Right to Be Forgotten" law a little while back, most American academics were quite skeptical and even outraged, and you said actually that you thought it sounded perfectly sensible. So, my question is, have more of the critics come over to your side now? Eric Posner: No, they haven’t, but I’m hoping they’ll change their minds soon enough. John Donvan: Especially after tonight. Eric Posner: Especially after tonight. John Donvan: Ladies and gentlemen, the team arguing for the motion that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." [applause] John Donvan: And we have two debaters arguing vociferously against this motion. Please, let’s welcome Andrew McLaughlin. [applause] John Donvan: Andrew, you are now the CEO of Digg and Instapaper.

  • 00:02:56

    You were an advisor to President Obama on Internet technology policy. You are a former director of global public policy at Google. You have called the EU decision a "travesty." You’re no longer at Google, but if you were, would you be using that kind of language? Andrew McLaughlin: [laughs] Well, I’d probably use less restrained language if I were still at Google. The effect of the decision on that company is kind of singular. There’s really no place in the world that’s been so affected by it. But I do think it’s a travesty. John Donvan: Okay, we’re making clear you’re not here tonight as the Google guy. And tell us who your partner is, Andrew? Andrew McLaughlin: The equally eminent Professor Jonathan Zittrain. John Donvan: Ladies and gentlemen, Jonathan Zittrain. [applause] John Donvan: The eminent Jonathan Zittrain, you’re a professor of law and computer science at Harvard. You’re a cofounder of the Berkman Center for Internet & Society. You looked at the EU’s — Europe’s "Right to Be Forgotten," and you said that there is a certain elegance to the idea, because — you’re saying something should not follow you around for life. But does that mean essentially you agree with the other side?

  • 00:04:01

    Jonathan Zittrain: Well, we could end early and just hit the bar, but — [laughter] Jonathan Zittrain: — I’ve described the "Right to Be Forgotten" as a poor solution to a very real problem. And in that sense I have many sympathies, not just with the other side but probably with everyone in the room who wants that ability to get something taken down sometimes. John Donvan: All right. We’re very interested to see where you go with that argument. Again, the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online," in this debate, and it is a debate. This is an intellectual joust where only one side will win, and that side will be chosen by vote of our audience here in New York City. By the time the debate has ended, you’ll have been asked to vote twice on your view on this motion, for or against. And the team whose numbers have changed the most between the first and the second vote will be declared our winner. So, let’s register the first vote. If you go to the keypads at your seat, take a look again at the motion on the screens. It says, "The U.S. Should Adopt the 'Right to Be Forgotten' Online."

  • 00:04:59

    Push number one if you’re for the motion, number two if you’re against, and number three if you’re undecided. And we’ll take about 15 seconds to complete that process, and then we’ll lock out the voting devices. All right, folks, let’s lock out those votes. And, again, we’ll have the second vote right after you’ve heard the closing arguments. And then we’ll have the results of the difference between the two votes within about a minute and a half. Our motion is this: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Round one, opening statements from each of our debaters in turn, and here to argue in support of this motion, Paul Nemitz.

  • 00:06:00

    He is director for Fundamental Rights & Union Citizenship in the Directorate General for Justice & Consumers of the European Commission. Ladies and gentlemen, Paul Nemitz. [applause] Paul Nemitz: Thank you very much. Thank you. So let me say, first of all, I’m not here on an official mission from the European Commission to turn America to this notion of law. It’s more the academic pleasure and the opportunity to come to Manhattan and discuss with these great American thinkers. And I will just report to you how this works in Europe and where it comes from. So, first of all, the "Right to Be Forgotten" — the words, the natural words, they don’t tell you the whole story, because actually it is about a deletion right. It is about you exercising control over personal data, you asking big corporations or the state to delete the data which they have about you. Why is this important? It is important because it allows you to control your own life.

  • 00:07:00

    In the language of the German constitutional court, already from the ’70s, this was called informational self-determination, and the judges said: in a democracy, in a free country, you as an individual must have the right to control what others know about you. You must be able to ask, "What do you have?" and you must be able to ask for deletion, of course within reason. You cannot ask a doctor to delete the medical records, because the doctor must keep them to show what he has done or she has done, in terms of liability. Of course you can’t ask the press to delete a bad review if you are a concert pianist, or for that matter if you are someone who has committed malpractice or if you are a politician. There are limits to this, limits which are important in a democratic society, because that’s what you will hear from the other side — that this is an attack on democracy. We see that differently in Europe.

  • 00:07:59

    We believe democracy needs both. It needs privacy. It needs the ability of the individual to decide himself or herself what the state in particular knows about them, because if you don’t have privacy, how can you organize dissent? How can you organize a new political party, which maybe wants to [unintelligible], if the government already knows everything about you? So, we need privacy in a free society as we need free speech. And therefore — and now I turn to the recent judgment. First of all, you have to understand this just doesn’t fall from the sky like this; it is an application of existing law which we have had in Europe since 1995. And it is basically a right to ask a big corporation to delete information about you. Google argued in the case, "This doesn’t apply to us. We are not what we call in Europe a data controller." Our judges saw that differently.

  • 00:08:59

    They said: sure, you control data about individuals and you profile them. When you put in the name of an individual, Google gives you more than anything else, any other source of information about this person, and our law requires that, in the same way that our people can ask data brokers, the bank — you know, they can ask in school after a certain time for things to be deleted — they should be able to ask Google for deletion. Now, the person in question who brought this case from Spain had not paid some social contributions, and therefore his house had been confiscated, and it was a legal obligation in Spain to have a publication in a newspaper of this fact in order to make the auctioning of the house to cover the social contributions better and attractive for the Spanish state, because if it’s in the newspaper there will be more people bidding for this house.

  • 00:09:56

    So, the court said: you cannot ask the newspaper to take down the information, because they have a legal obligation to publish, but Google you can ask. And this is a principle which also applies when it comes to free speech and democracy. It may well be that Google has to take things down, but it doesn’t mean the information disappears. It will stay in the newspaper. It stays on the website of the newspaper, of the BBC, of the television station — all this stays around, but Google is subject to the same law of self-control of information, of self-determination of individuals, which allows them to ask, first, what do you have about me, and second, please delete. Does it work? Yes, it works well. I just last week met the chief privacy officer of Google. He told me that now they are dealing with these requests in real time. In the beginning there were around 200,000 requests, and they had to employ a number of new staff to deal with this, but now it works well.

  • 00:11:03

    There are practically no complaints which do go to the court. Google has understood the European law of balance between privacy and free speech. The decisions were, you know, in the beginning very contested. I think in the meantime everybody says this is working well. And finally, this is something which is important for the future, because more and more data about you will be collected in the digital age, in the internet age, in the age of ever stronger sensors, wearables, the internet of things, and so on. So, when you think tonight about the motion, think about these new sensors. Think about the Apple watch. Think about the department store in Manhattan which now follows you on your mobile phone wherever you stand. All this is collected and recorded, and there are address dealers and information dealers which have files on you with between 200 and 2,000 parameters about every person.

  • 00:12:02

    And these files, they are sold — for example, in a political campaign. Those who have more money, they can buy these files and target you as a voter. If you say, "I as an individual want to have control over my data and what’s happening to them, and I want my children to have control, and I want to have, as a parent, control over what happens to data of my children," then you should vote in favor of the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." Last remark: you already have it in some parts of the United States, in some parts of law. For example, in California, there is now a law which gives the possibility for parents — and children when they become 18 — to wipe out everything until they were 18. Under the Fair Credit Reporting Act, information about your credit which is older than seven years, you can ask to be deleted.

  • 00:13:02

    So it is already there in part, and Federal Trade Commissioner Julie Brill just yesterday said, "Yes, this is what we need in America, a right to obscurity." Thank you very much. John Donvan: Thank you, Paul Nemitz. [applause] John Donvan: And the motion is that "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, Andrew McLaughlin. He is the CEO of Digg and Instapaper. He is a former U.S. deputy chief technology officer for the Obama administration. Ladies and gentlemen, Andrew McLaughlin. [applause] Andrew McLaughlin: Thank you very much. My mission here tonight is not to persuade you that the internet is perfect the way that it is. My mission is to urge you to vote against the resolution. And unlike Paul, I am of the view that this so-called "right to be forgotten" in the EU is an epically bad decision, a travesty, an example of great moral and institutional failure on the part of the European Union itself.

  • 00:14:04

    But my mission here tonight is somewhat more narrow, which is to secure your vote against the motion. As Jonathan alluded to, the emotional underpinnings of this motion, the emotional rationale that lies behind the "right to be forgotten," is powerful. And it is understandable. There are many people on this planet who have made mistakes in their past, and the internet for them acts as some kind of a permanent record that follows them everywhere. And the fear that it generates is that you will forever be defined by that one mistake that you made that ended up on the internet or in the press. The problem, though, is the following. And this is what I’m going to try to persuade you of. First, the "right to be forgotten" is, no matter how you try to construct it, censorship. Censorship has to clear a very high bar in order to be justified in a free and democratic society. The "right to be forgotten" does not clear that bar. There are vastly better ways to accommodate the emotional rationale for the "right to be forgotten."

  • 00:15:05

    And that doesn’t even really get into the implementation dilemmas, all of the different practical reasons why this thing will never, in any meaningful sense, become a right. So that’s the outline of the argument that I’m going to make. Let me just begin with a key point, which is that the "right to be forgotten" is censorship. Note the odd passive voice in the construction of the right: "the right to be forgotten." We call it a right, but really, it’s a duty. And Paul owned up to this very directly in his opening statement. The "right to be forgotten" is a duty to be able to force other people to forget what they would otherwise remember. That ability, that right to remember, is one of the most fundamental rights, I think, of being a human being. By suppressing true information — and let’s be very clear, that is what we’re talking about here. We’re talking about the suppression of true information, not libel, not defamation, not hate speech.

  • 00:15:59

    True information. The "right to be forgotten" intrudes into our collective memory by trying to suppress the documents that constitute that memory. So, we’re talking about the regulation of speech and, really, the regulation of thought, which is ultimately what memory is. So, this is the right to force people to forget true information. Paul said — the way he framed it was that this is a right to control your own life, the right to demand deletion. The problem, though, is that I have the right to control what I remember. I have the right to control what I say. I do not have the right to control what you remember. I do not have the right to control what you say. And so for that reason, we must treat the "right to be forgotten" as it is, which is a form of censorship. So, as I said, censorship in a free and democratic society needs to clear a very high bar to be justified. The "right to be forgotten" does not meet that bar. First of all, it’s way too prone to abuse. It is vague, it is subjective.

  • 00:16:59

    The language in the European court’s decision is that information should be deleted from searches about the person if it is inadequate, irrelevant or no longer relevant, or excessive. I challenge you to find a more vague and subjective legal standard anywhere in the law of a free society. It’s very clear, by the way, as a practical matter, that this law favors the interests of well-connected elites in Europe — those who are wealthy, those who are in office, those who would like to suppress embarrassing facts about their lives. And, contrary to Paul’s claim that it’s working kind of well, you can Google — while you can — any number of articles. The ones that I found were in the English-language press in the UK, outlining all of the articles that have been suppressed, because Google gives notification to the publisher when an article has been suppressed. And so you will find crimes, you will find criticism, you’ll find the case of the piano player who didn’t like a 2010 Washington Post unflattering review of a concert that he gave.

  • 00:18:02

    But there are many, many examples: a lecturer who was suspended got an article about that disciplinary action suppressed; a man who tried to kill members of his own family was able to have those articles suppressed in searches about his name. And these are from, like, 2010, 2011. These are not ancient history. So, one other point that I want to make is that the "right to be forgotten" undermines our moral standing vis-a-vis dictatorships. In a society with steady rule of law, I suppose it’s conceivable that you could have some kind of court mechanism adjudicate the "right to be forgotten" and come up with a body of case law that would define the bounds of the right, flesh it out, and so forth. But in most of the world, that’s not the case. I guarantee you that the dictators in many parts of the world have a different view about what is relevant and irrelevant about their pasts than we have, and they would gladly use this right to suppress speech and to justify it in their own countries.

  • 00:19:00

    So, as I said before, there are better ways to accommodate this right. For example, in the American tradition, we believe not in censorship, but in more speech. So, for example, if there are links on Google, a better solution might be to say that the subject of the search could add a link, add a comment, or add a symbol that would click you through to whatever you want to say about that link. You might have a right to respond as opposed to a right to censor. Moreover, we’ve seen that the internet adapts. We’ve seen the rise of applications like Snapchat and Wuut, W-U-U-T, which are applications that are designed not to be searchable and to facilitate communication without a permanent record being compiled and assembled. In other words, rather than scrubbing information embarrassing to a person, we might even think more broadly, as a society, about how we treat past mistakes and past errors. Can we learn to broaden and embrace and allow for the self-reinvention that in so many ways defines what it is to be American?

  • 00:19:54

    That space for personal development and trial and error as you grow up can be accommodated if, as a society, we learn perhaps not to disenfranchise ex-felons for the rest of their lives, not to penalize them by making it difficult to get a job, and so forth. And so finally, I’ll just end on this note, which is that, you know, as George Orwell said, "He who controls the past controls the future." Giving any number of individuals a vague standard by which to control their own pasts, and thereby try to control their own futures, has incredibly negative effects. This right is not necessary. I urge you to vote against the motion. John Donvan: Thank you, Andrew McLaughlin. [applause] John Donvan: And a reminder of what’s going on: we are halfway through this opening round of the Intelligence Squared US debate. I’m John Donvan. And we have four debaters, two teams of two, arguing it out over this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." You’ve heard two of the opening arguments, and now on to the third. Let’s welcome to the lectern Eric Posner. He is the Kirkland & Ellis Distinguished Service Professor of Law at the University of Chicago. Ladies and gentlemen, Eric Posner.

  • 00:21:00

    [applause] Eric Posner: I’d like to ask you to cast your minds back 25 years, 30 years. For those of you who are old enough — or for some of the younger people here, I’ll just have to tell you what it was like then — when we enjoyed something called "privacy." And privacy is a complicated concept, so I want to convey it to you with a few examples. So, imagine a 17-year-old boy. He’s arrested for selling drugs. A news item appears in the local paper. The police then realize they made a mistake, and they let him go. That’s not published in the newspaper. Now, the people around him in his neighborhood may know about this, they may not know about it. They know a lot about this kid. He moves on with his life. It’s not a big deal. Second example: a single mom is depressed; she has her hands full with her kids.

  • 00:22:01

    She sees a psychotherapist, she misses some credit card bills, files for bankruptcy. But bankruptcy gives her a fresh start. She gets back on her feet, and she’s able to continue with her life. Now, again, her friends and neighbors might know a bit about this, but they also know about her and what she’s like and how she is a good mother. So, they know a little bit about what’s going on, but they still have a relatively complete image of who she is. A third example: a married couple has a terrible divorce. They file for divorce in court. They say terrible things to each other, flinging wild accusations, some of which may contain an element of truth, some of which is exaggerated. They might refer to each other’s infidelity. They might allege that the other spouse neglected the children or said bad things about them.

  • 00:22:57

    Eventually the divorce is settled, and they move on with their lives. So, in all these cases, these events happened. They were public to some extent, but at the same time they’re private. Not everybody in the world, or even outside this area, hears about these events. And the law protects them. There are privacy laws — there were privacy laws; they still exist. There are even laws called "expungement statutes" that would have allowed the teenager, for example, to erase his arrest record after a period of time had passed. And although free speech was robust at that time, so were privacy rights. People understood that while other people would have an interest in knowing things about you and talking about you, individuals also have an interest in controlling information about themselves — not necessarily political information; it could just be personal information, things that are embarrassing. Now, I want you to imagine back then in 1990 that an academic — a kooky academic like me or Jonathan — came up with the following proposal.

  • 00:24:00

    We would say, "This is a terrible state of affairs that we have," because, after all, if a stranger meets someone — let’s say it could be an employer or a creditor or a future romantic partner — it’s really in the stranger’s interest to be able to know as much about this person as possible. Maybe a future employer would want to know that this kid had been arrested. Even if he wasn’t convicted, it’s still information that would be useful. So to solve this problem of massive social ignorance, what we should do is record everything that everybody does, put it onto a searchable database, and make this database available to everybody in the world. Okay, so if somebody had made that proposal — even someone like Jonathan, smart guy, or me, not so smart guy — we would have been laughed out of the room: "What a ridiculous proposal, what a tremendous invasion of people’s privacy." But, of course, this is what’s happened. It’s happened over the last 20 or 30 years, not because of a conscious decision by the public or by the government to implement this crazy recording system, but because of the development of the Internet.

  • 00:25:04

    Now, what happened? The law has remained the same, but a very important element of privacy back in the ’80s and ’90s, and before then, was simply the physical configuration of the environment. It was just very difficult to collect information about people and disseminate it. Okay. That’s completely changed. Technology has changed. So the law at the time — the privacy laws at the time — could be relatively weak, and yet people’s privacy was adequately protected. But now, with this technological change, this old balance between my interest in controlling information about myself and your interest in knowing about me has been upset. And all that the "Right to Be Forgotten" would do is restore this balance. Okay, so, if we go through these three people again: this kid, 10, 15 years later — let’s suppose all of this happened, you know, in the last few years — he applies for a job, the employer puts his name in Google, and the first thing that comes up is this arrest.

  • 00:26:02

    Now, the employer might understand that this is not important, but he might not. This is the only thing that comes up about this guy. Why not pass him by and hire somebody else? Maybe the single mother’s friends, you know, in the ’90s they would have gossiped, where maybe today they would write about it on Facebook. Maybe she would’ve written about it on Facebook or on a blog. Now a potential romantic partner, employer, neighbor, colleague puts her name in — maybe she moves to a new city — puts her name in and immediately finds out that she had a mental illness. That’s how she’s going to be defined to the public. And in the case of the divorced couple, in the old days divorce would be public. It’s public information. But it was, you know, put in a file that was stored in the basement of a courthouse. If you wanted to, you could find out about that information, but nobody had any interest, at least with respect to ordinary people.

  • 00:27:03

    Nowadays, you know, their children could type their parents’ names into Google and out come these allegations. And, of course, not only their children, but future employers, romantic partners, friends, colleagues — anybody in the world, anywhere they go. So, they’ve lost control over this information about themselves, and it’s not just that they’ve lost this control; they’ve lost the important context. You know, the neighbors, the people in the neighborhood, they know the context of these events. The strangers who do the Google search do not. So, the right to be forgotten would simply restore us, at least partly, to this older period of time. People were happy with the balance between privacy and speech in those days, and I think people would be happy with this balance again if privacy were improved somewhat. So, for that reason you should vote in favor of the motion for the "Right to be Forgotten" Online in the United States. Thank you very much.

  • 00:27:57

    [applause] John Donvan: Thank you, Eric Posner. And that is the motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to make his argument against this motion, please welcome Jonathan Zittrain. He is the George Bemis Professor of Law at Harvard Law School and the Kennedy — [applause] John Donvan: — School of Government. Ladies and gentlemen, Jonathan Zittrain. Jonathan Zittrain: Thank you very much, John, and thank you all. I guess I should start with some points of agreement. So, if the question is, are there problems with privacy? Yes, there are. And if that were the motion, I would hope you would join us in voting for it. If the question were, have problems of privacy gotten more difficult over time? We’d vote for that motion, too. I think I’m comfortable saying Andrew would vote for it. If the motion were, should everything be recorded at all times and, if so, should we do something about that? I would also be in favor. But none of these is the motion.

  • 00:28:54

    The question is about a right to be forgotten and I want to explain in my opening statement that that’s a very bad solution to a very real problem. So, let’s take an example. Imagine that there was something that I could say right now that would be not felicitous for someone in the world. That’s not that hard to imagine. And this debate gets recorded, the podcast goes up, and at some point that aggrieved person files a dispute with Google over the existence of this recording, the web page hosting it. Google will receive that request, that complaint, and it is bound not to tell anybody about it. Google then thinks about it. How does Google think? Companies don’t think except with AI. Eventually this will be AI, but not yet. So, they have, I assume, some interns? [laughter] Somebody making $10 an hour that is just getting a stream of those 100,000 requests, it’s like time to make the donuts, you’re like yes, no, no, and Paul’s like, you know, they’ve really learned.

  • 00:30:06

    They’ve gotten good at this, and that’s because if they don’t grant the request, the aggrieved person can appeal in Europe to a data protection authority. I presume there’d be something similar here. That’s bad. That’s more process. If Google grants the request, that’s it. Intelligence Squared wouldn’t be informed that their page is no longer findable through the world’s most powerful information resource? There’d be no appeal by anyone else. At the moment, when it is granted, Google notifies the Telegraph, for example, if a story on the Telegraph is taken out of the Google index as a result of this right. The European Union is fighting that notification, because it observes when the Telegraph puts it out, here’s what was deleted today, it kind of defeats the purpose of deleting it from Google.

  • 00:30:57

    [laughter] Jonathan Zittrain: It has to be built into this right that we have no idea what we’re actually talking about, which is why, oddly enough, as much as Google hates this gift, they’re like looking at the box and it’s ticking, I’m hoping they don’t open it. I’m hoping they continue to fight it, because at some point they may make accommodation with it and when they do it will mean that the one protection we have had, apologies, Andrew, to your former employer, against Google basically running the world is that even Google doesn’t know what comes up in Google search results. It’s a roulette wheel even to them, but once they cross that blood-brain barrier and start hand-tweaking results on request, like a DJ taking in what somebody wants, well, why shouldn’t they just do it generally to improve things? And at that point the reality we experience will be one set by this company. And that’s what’s so strange. If there’s going to be something taken out, let a court adjudicate it.

  • 00:32:01

    Let it do so in camera. That is to say, in secret, so the fact that it’s adjudicating doesn’t spill the beans. That should be extremely rare. It should be for corner cases, maybe for cases of the sort that Eric was bringing up. But it cannot be hundreds of thousands of things getting processed because it won’t just be AI on Google’s side taking it out, it’ll be AI on the complainant’s side. I’ll write a check to somebody every month, a company that will go ahead, and every time it sees my name in what might not be a good light, it’ll automatically file a request on my behalf, and I don’t even need to be any the wiser. This starts to be eliminating the middle man so much that it eliminates us from the process of understanding the world. Now, I think I should end, then, by talking a little bit about the ecosystem in which this ill-considered proposal has its best shot at succeeding. And it’s today’s ecosystem. And basically, it’s just Google.

  • 00:33:00

    I’m like, “Bing, how many are you getting?” They’re like, “We can’t tell you.” It’s like, “You can’t tell us because there’s so many or because nobody is bothering to take anything out of Bing?” Apologies to Microsoft. [laughter] Jonathan Zittrain: But if we ended up with Wikisearch — with — do you remember Dogpile? It would do searches of multiple search engines at once. Does each company now have to start fronting a system to start taking stuff down? What if it is Wiki? What if it’s a nonprofit? In fact, what if it’s a Wiki page of Wikipedia that’s taken down, and the page gets changed so the objectionable material is gone, it’s been edited out, it’s been improved? Who goes back to undelete the hole in the Swiss cheese? The answer is no one. Until these questions can be answered, and maybe they can, back to you guys, I say, please vote against this motion even as you pluck a violin along with us for the very real problems that the for side is rightfully bringing up.

  • 00:34:02

    We’ve got to work on this stuff. Maybe court records shouldn’t be immediately Google searchable. There should be some frame that keeps that cat a little bit in the bag. We can go case by case and try to come up with categories of information. But if we just stick with a generalized right for, in the words of the decision by the European Court of Justice, "no longer relevant in the view of the complainant," information to be taken out and procedures that must, by their nature, be secret, we will be betraying all of the great things that the information revolution has brought us along with the bad. Thank you very much. John Donvan: Thank you, Jonathan Zittrain. And that concludes round one of this Intelligence Squared US debate, where our motion is the US should adopt the "right to be forgotten" online. Keep in mind, please, how you voted at the top of the evening. Again, we’re going to have you vote a second time after you have heard all of the arguments. And it will be the team whose numbers have moved the most who will be declared our winner.

  • 00:35:03

    On to round two. Round two is where the debaters address one another in turn and take questions from me and from you in our live audience here at the Kaufman Center in New York City. Our motion is, the US should adopt the "right to be forgotten" online. We have heard one team arguing for this motion. Paul Nemitz and Eric Posner have argued that technological change means that the balance that once existed between privacy and free speech is now all wrong, and it needs to be restored in a world where the worst moment in a person’s otherwise well-lived life can be the thing that stalks them forever through search results on Google. They say put this in terms of a right to delete. Think of it in terms of informational self-determination. The team arguing against the motion, Andrew McLaughlin and Jonathan Zittrain, they concede that this really is a problem.

  • 00:35:57

    This issue of privacy, but they say the solution, the "right to be forgotten," is a terrible solution because it conflicts, number one, with the right to remember, which they argue is an important and valid right. And more importantly, they say the whole thing smacks of censorship that will lead to a system where nobody knows what’s being taken out, and stuff taken out may never once again return to the public realm. I want to go to the team that’s arguing in support of this motion. The dirty word that your opponents have raised repeatedly is "censorship," that the kind of right that we’re talking about, where an individual can go to Google and say, "When my name comes up, I don’t want this link to show up or this link to show up or this link to show up." But that amounts to censorship. How do you respond to that? Paul Nemitz. Paul Nemitz: I heard this very clearly. This would be a tool for dictators. And our moral standing would be undermined if we allow this right to be exercised in democratic societies.

  • 00:36:57

    I would see it exactly the other way around. History has taught Europeans, whether fascistic dictators or communist dictators, they don’t want any privacy. They wanted to know everything about you. There was a block reporter in every block of housing. And that is the history of this right you have to think about. Dictators don’t want anybody to have privacy. So, I think the understanding where we are coming from is different, and I would like to, like Jonathan, distinguish this notion from other things you might be thinking about. The notion here tonight is not the US should adopt the EU "right to be forgotten" as ruled by the European Court of Justice exactly. John Donvan: Paul, let me step in because I asked you a question, and I don’t feel you’re answering, which was the response that they are calling this censorship. Paul Nemitz: It’s not censorship because the state is not making information disappear as they did in the communist and the Nazi dictatorships.

  • 00:38:02

    It’s an individual who wants to have control over their past lives — John Donvan: Okay. Let me let the other side respond to that. Andrew McLaughlin. Andrew McLaughlin: Well, that is censorship, right? So in this case, the law, which is an instrument of the state, is saying that the individual has the right to force deletion of true information published by others. That’s censorship. John Donvan: When you say — just to be clear, you’re not talk — when we say "deletion of information published by others," that is deletion of — literally of information published by Google, which are the search results. But the documents that are linked to stay online. Andrew McLaughlin: That is correct. I mean, I grant that the Google search result is different from the underlying information, but unless you can find it, it may as well not exist. And search is how people get at things. Let me just give one example, which is one of the ones which was reported by the Daily Mail. They took out a link to a 2002 story that reported somewhat amusingly on a dispute between two families in the north of England over a terrier dog.

  • 00:39:05

    And so that search result is now deleted as to that person mentioned in the story. So, again, this is true information. It did happen. The story reported on it, and now the EU says the individual mentioned in the story had the right to force the deletion as to them of that story. John Donvan: But the story is still there. Andrew McLaughlin: That’s correct. John Donvan: I’ve got to — so let me — let me bring in Eric Posner. Eric Posner: Yes. Censorship in the sense that Andrew used it — uses it has always existed in this country. There is a tort right to privacy. If a nosy journalist pokes into your affairs, finds out true facts, tries to publish them, you have a tort action against them. You can get damages as a — John Donvan: Just for folks who aren’t lawyers, can you explain the term "tort action"? Eric Posner: You can just sue them and get money. And — [laughter] Eric Posner: As a consequence — John Donvan: Now we’re talking. Eric Posner: As a consequence of that, journalists do not do this. Many of you have seen, on reality TV shows and other places, that faces are blurred out. Well, that’s a kind of censorship as well. Andrew McLaughlin: Eric, does that tort extend to a journalist who — or let’s say a newspaper that writes a story about that dispute?

  • 00:40:05

    Eric Posner: Not the subsequent. But the original — Andrew McLaughlin: But that’s not censorship. That’s a court action — Eric Posner: the original — John Donvan: Let’s let Paul — Eric continue. Andrew McLaughlin: Sorry. John Donvan: Eric, were you done [unintelligible]? Eric Posner: Yeah, just that the — there is censorship in the sense that the government prohibits you from reporting true facts. And the reason for that censorship is that there’s also a strong interest in privacy. John Donvan: We’ll take Jonathan Zittrain. Jonathan Zittrain: I wouldn’t get too hung up on the word "censorship." I think that it’s — it should be starting a conversation rather than ending it. The key distinction in the examples Eric was giving was that these are adjudicated instances where the state is like, what that person has said is so terrible they owe money. It might be a further move by the state, with some trepidation, to say, and you must never say that again, and what you said must be, you know, stricken from the public record. That should be done very rarely and after a lot of process.

  • 00:40:56

    What’s happening here, in a "right to be forgotten" implemented online, is exactly the opposite. And in that sense, it’s very different from the examples you’re coming up with. The fact, too, that it’s only coming out of Google’s index rather than the root thing being taken down, I would just see that as part of the paradox. If the thing is okay enough to stay up, it’s not so bad that it needs to come down, it’s like saying the book can stay in the library. We just have to set fire to the card catalog, like go on down, enjoy the stacks, to use an example from the ’90s that you might relate to. John Donvan: Would — would one of you — Jonathan Zittrain: That just seems Borgesian. John Donvan: I want to allow one of you to address that metaphor. The audience connected with that. [laughter] John Donvan: So, I want to see what your response to it is. Paul Nemitz: I can — John Donvan: Do you want to take it, Paul? All right. Eric Posner. Eric Posner: Here is another metaphor, so — [laughter] Eric Posner: Well, for example — John Donvan: It has to be a direct response metaphor or you have to directly respond to their — Eric Posner: Okay. So the — so — John Donvan: And there’s a point there. The point they’re making is it’s not censorship of the source material. It’s still there.

  • 00:41:58

    You can go find it if you dig enough, that it’s actually — Male Speaker: You know, the [unintelligible]. You just have to rearrange the letters. Eric Posner: I agree — I agree with Jonathan. John Donvan: Eric Posner. Eric Posner: I agree with Jonathan that the word "censorship" is not helpful here. What the "Right to Be Forgotten" does is it raises the cost for strangers to find out information about you. It doesn’t make it impossible. So, if you turn out to be a famous important person, become a politician, journalists, biographers can dig it out. It just makes it difficult. But that’s the way things work. That’s the way we secure our homes, for example. We don’t make it impossible to break into our house. That would be far too expensive. We put in a lock that will deter most people. It won’t deter people who are extremely interested in getting into our house. That’s — so you raise the cost without making something impossible. That’s a way of putting a thumb on the scale for privacy. John Donvan: Andrew McLaughlin. Andrew McLaughlin: I think what’s — you know, what’s interesting about this to me is I regularly hear EU officials, you know, say, "Look, there’s just nothing much to see here," like this is — only if you search for the person’s name directly will that search result not appear.

  • 00:43:04

    It’s just a very small thing. And to me that actually highlights the kind of moral bankruptcy of this whole thing because they are essentially validating the notion that an individual, a dictator, a politician, a rich person can secure the deletion of irrelevant information, whatever that means, in this one very narrow way. And, believe me, I am confident that that principle will expand. There’s no way that it’s just going to be left — John Donvan: Paul Nemitz. Andrew McLaughlin: — limited to search engines — Paul Nemitz: Yes, I think what Andrew and Jonathan are doing here is they’re pretending that everybody who makes use of this right is someone with some public interest to know actually what this person has been doing and what it is about. But that’s not the case. Most of these requests have — more than 70 percent — have nothing to do with public interest.

  • 00:44:01

    So, I would like to say very clearly here, whenever there is a public interest of the nature which is important in a democracy to know, an interest in someone who stepped out, him or herself, into the public, the "Right to Be Forgotten" doesn’t apply. The example, for example, which Andrew gave, from the Washington Post, namely that a concert pianist was asking for a takedown of the critique — the bad critique of his piano performance here in America, yes, the person asked for it, but under our law it would never be right to take this down because this person has stepped into the public arena. So, I think honestly we have a fake debate here. What we need to debate is the chilling effect on democracy and on public discussion. What chilling effect is bigger, if we are able to know everything about you and even in a way which you wouldn’t even think of or whether we allow individuals to have control over the data.

  • 00:44:58

    I give you one example what you’re not thinking of today. You’re not a very political person maybe but from time to time you push a like button on Facebook. [laughter] Paul Nemitz: Okay? Because you have political thinking and a little bit — but you don’t want to go out in public and say, "I’m a Democrat," "I’m a Republican." Tomorrow Facebook will aggregate and map and link to your person the political profile from your Facebook buttons, whether you want it or not. That’s the chilling effect. John Donvan: Let me — Paul Nemitz: Because when that’s happening tomorrow, you will not even push the Facebook buttons anymore. So, the issue here is what is more of a chilling effect on a democracy, if big corporations and the state can know everything about you and profile you in a way that you wouldn’t even dream of thinking — John Donvan: Okay, Paul — Paul Nemitz: — or giving you the right to deletion. John Donvan: — you’ve phrased the question — [applause] John Donvan: — twice, and it’s great. And I want to take it to Jonathan Zittrain. Which is the more chilling effect? Jonathan Zittrain: I was just hearing Paul talk about people clicking like buttons and then thinking there’s a pipeline of that onto the open web that Google then indexes, so that’s factually not true.

  • 00:46:05

    Facebook — Paul Nemitz: Not today, Jonathan. Jonathan Zittrain: — well, yeah — Paul Nemitz: But tomorrow, my friend. Jonathan Zittrain: — I — I’m saying — [laughter] Paul Nemitz: Right? Jonathan Zittrain: — yes, yes — [applause] Paul Nemitz: Here we are in America. The innovation of technology doesn’t stop. Jonathan Zittrain: — yes, yes, and — John Donvan: Jonathan, I think, though, that the — Jonathan Zittrain: — yeah. John Donvan: — the spirit of his question should not be lost in the technicalities. Jonathan Zittrain: Yes. John Donvan: I think he’s asking a serious question here. Jonathan Zittrain: Just the fact that he’s wrong should not, I agree — [laughter] Jonathan Zittrain: — block the — yeah, that’s true. [talking simultaneously] John Donvan: He might be wrong about what happens with the like button. Jonathan Zittrain: Yeah. John Donvan: He’s talking about which is the more chilling effect, knowing that you’re out there in a world where you have to be extremely careful about what you say because it might come back to bite you or the alternative where you actually — it’s more chilling for some entity to have that power or individual to have that power. Jonathan Zittrain: I think both are bad situations, and I reject having to choose between the two of them, because if that’s the choice I have to make, if it’s really the choice, any time somebody says, "You can have this or you can have every single thing you do in the world be Google-able forever," I’ll probably take this without even looking inside the box.

  • 00:47:07

    But that’s a false dichotomy. I think there is plenty — Paul is doing good work in the EU to look at a Facebook and say, geez, you guys are gathering so much stuff, before you have a Valdez-like spill or just open up the ports and spill the oil out, because why not, if that’s what you’re wanting to do, Facebook. You’d better talk with us because that’s personally sensitive data. Great. Regulate it at the collection and at the source, but by the time you have a general web and stuff out on it and are expecting anybody who dares to index it — and I’m, again, hoping it will be more than Google who does this, please, Bing, do it too, please, people we haven’t heard of, do it too — the idea that they have to be handling one request at a time under incredibly vague criteria that are by their nature unreviewable, that’s what I object to. John Donvan: Let’s take the criteria to your opponents.

  • 00:48:01

    These criteria are "no longer relevant," "inadequate" information, and I’m not sure that I understand what it means for information to be inadequate. Inadequate to whom? For what purpose? Eric Posner. Eric Posner: These complaints are generic in legal debates. Every legal standard is ambiguous, so these old privacy rights, for example, require the government or the courts or juries to balance the public’s interest in something, a vague term. Relevance is a term that comes up over and again. So, this happens and what happens is through repeated decision making it becomes clearer and clearer what the actual rules are. Now, the rules as they’re stated are vague, but when you actually read the cases it becomes pretty clear what’s going on. Ordinary people, non-famous people are the ones who win.

  • 00:48:58

    Famous people don’t win. Ordinary people, and not ordinary people who commit crimes or serious crimes, and there’s frequently a time component, you know, five, 10 years ago, and you can, by reading these things, get a pretty good sense of what the standard is and be able to make predictions about how Google will apply these standards in the future. It’s just something that happens over and over again in the law and people are able to work with it. Jonathan Zittrain: It happens over and over again in the law. That’s what happens when cases are heard by courts. In public session, decisions are reported and things evolve. Eric, I want to ask you directly, in this right to be forgotten, when you have a private company making the decision at the insistence of the government it must keep it secret, the cases we hear about are happenstance, how does that evolution take place? John Donvan: Let’s — can you defer to Paul? Paul Nemitz. Paul Nemitz: I mean, these decisions have taken place for decades in normal press law in Europe.

  • 00:49:56

    Because before the Google case, as I explained to you, we had the right to deletion and, for example, you know, newspapers, they have to make on a daily basis a decision, do we publish something or not, and if they get a request, for example, to correct or to take down from a website some information, they have to make this decision. So, it’s true that if you only read this one judgment, and that’s normal in the law, you don’t understand what these criteria mean. But if you go to the European Court of Human Rights, for example, the Von Hannover case, very clear criteria. What are they? A) whenever the information is of relevance, of public interest in a democracy as a contribution to public debate, it cannot be taken down. Second, when the person in question has made himself or herself the decision to go public, into a public sphere like, for example, the concert artist, the information cannot be taken down. So, I would say Eric has said it right.

  • 00:50:56

    All the many examples which you invoke of people who tried, I’ve heard here some examples, the guy who did the professional [unintelligible], the concert pianist who asked for critique to be taken down, all this under the law should not be taken down. John Donvan: Andrew — Paul Nemitz: On the other hand — John Donvan: Well, let me — Paul Nemitz: It is not different from any other newspaper previously. Newspapers have to make these decisions — John Donvan: Paul, in the interest of time I need to interrupt you there, so forgive me. I want to go to Andrew McLaughlin to respond to the point that Paul was just making. In your opening statement you said that you could foresee a world in which the powerful elite had more ability to make themselves forgotten than the ordinary guy and Paul is actually saying it’s the ordinary guy who’s getting the takedowns and not the powerful and the public. What about that? Andrew McLaughlin: Well, the heart of my argument is not that somehow it’s only going to be used by the elite. I have no idea whether the majority of cases are being used by famous people or not famous people. I have no idea.

  • 00:51:56

    My point is that this right is, in my judgment, a typical kind of like EU elite construction dressed up as a privacy right, but in fact designed to allow the well-connected, the politicians, the business people to suppress embarrassing facts about their past. John Donvan: Why do you — why do you think that? Andrew McLaughlin: The record is — well, it’s part of my sort of cynical experience with the institutions of the EU, to be quite honest with you. John Donvan: So, we’re getting all scientific about it. Andrew McLaughlin: Let me say — but let me say. And it’s not like the US is any better, by the way. It’s just the way that it takes form over there. What I would urge everybody to do, while you still can, is Google "right to be forgotten horror stories." That was my search, and it produced a bunch of articles, the Guardian, the Daily Mail, that are absolutely contrary to Paul’s assurances that this is somehow only being used in cases which are not in the public interest. Footnote: what is in the public interest, of course, depends on who you ask.

  • 00:52:54

    So, for example, there is a story — I’m just taking one at random here, where a Scottish Premier League referee named “Dougie” McDonald — there was an article from 2010 about how he was found to have lied about a penalty call that he made in a Celtic versus Dundee United match, the backlash to which prompted his resignation. That story is now subject to the European "right to be forgotten." Jonathan Zittrain: Well, there goes this debate. It’s now going to be deleted again. Thanks a lot. [laughter] John Donvan: Only if you use that guy’s name. But fortunately, Jonathan, your name will still bring it up. Andrew McLaughlin: But I — let me just — John Donvan: It’s not over yet. Andrew McLaughlin: Let me make just one other point, though. So — which is that I very much agree with Paul that what is in front of us here is a sort of a false choice. I disagree on what those false choices are. Paul has set it up so that we’ve got censorship and surveillance, or if you want to say it the other way around, free speech and privacy, and that these two things are in tension and that we have to betray freedom of expression in order to vindicate privacy. I don’t agree with that. And the question that I would toss back over is, why not treat this in the classical liberal tradition by responding with a right to more speech, a right to add your speech to the speech that is being put forward.

  • 00:54:06

    John Donvan: How about that, Eric Posner? Eric Posner: Well, the poor people, the non-elites that you’re talking about, they don’t have the resources to take out newspaper ads, pay journalists or PR people to plant stories. Andrew McLaughlin: But you could just have a form on Google to type it in — Eric Posner: They’re the most vulnerable people. The — John Donvan: But, actually, to address Andrew’s point, he’s saying put a form on Google where it pops up with your search results are now on — here’s — here’s my response to what’s being [unintelligible]. Eric Posner: Yeah, but Google hasn’t done that, you know? If Google were held legally liable, maybe they would, right? So, Google has not done that because they don’t think that’s the profit-making thing to do. John Donvan: Okay, I also want to point out, your — your mission here is not to solve the problem. As you stated, your mission is to say that their solution is not [unintelligible]. Jonathan Zittrain: We want to give you buy one, get one free. John Donvan: All right. There you go. [laughter] John Donvan: No, I mean, it’s — I think people want to hear your idea, but I just want to say for the point of the vote and the debate, you do not have to solve the problem for them.

  • 00:54:58

    I want to see if there are any audience questions. I’d like to start going to those. Ma’am, right here. And a mic will come to you, if you can just wait till it reaches you. And again, I want to ask you, please be terse with the questions and really get to the question like directly. Is there a microphone coming down the aisle? You know what? Okay. I was going to say, we’ll go to this side and then come back to you. Thirty-seven seconds is lost. Female Speaker: Is it possible that the "right to be forgotten" infringes upon someone’s right to be remembered? So, in other words, a situation where one person is maybe drunk driving, and another person in the vehicle, you know, gets out and saves a bunch of people or, you know, the drunk driver from doing more harm or what have you. John Donvan: But you recognize that that forgetting will only happen when you put in the driver’s name. Female Speaker: Right. But still, I mean, that’s what I’m asking. John Donvan: Okay. Female Speaker: Is there a case, or have there been cases, where there is a, you know, controversy about whether it’s okay for, you know, I want it to be remembered because — John Donvan: But you want it remembered that the — you would want — in a sense, you’re saying that you would want the driver’s name to — he would want him to have to pay that price for — Female Speaker: Yeah, or a neighbor.

  • 00:56:07

    I want it remembered that the neighbor, you know, did thus and so, and the neighbor wants it forgotten that they did such and so. How — John Donvan: But think — is it — Female Speaker: Did they infringe upon each other’s [unintelligible]? John Donvan: Let’s take that to Paul because you’ve actually worked in implementation. Are there cases where individuals who are otherwise mentioned in this story, that won’t be linked under the other guy’s name, push back and want that link to show up under the other guy’s name? Paul Nemitz: Haven’t heard about it. John Donvan: Okay. That’s an interesting question, though. Down in the front here, sir. That mic is coming down behind you. Male Speaker: Thank you. Richard Falkenrath. I’m a past debater. Great debate. Thank you very much. I’m surprised that the concentration of economic and informational power in Google has not come up here. And Jonathan, I’m going to pose this as a question directly to you: should the audience conclude anything from the fact that Google is a financial supporter of the institution that you cofounded and are the faculty director of at Harvard?

  • 00:57:06

    Is that relevant in any way in your view to this debate? Jonathan Zittrain: You got me. [laughter] Jonathan Zittrain: How pleased I am that the Berkman Center discloses all its funders on its page. On my page I disclose everything I do, et cetera, et cetera, right down to the, you know — John Donvan: Let me just say we were very comfortable with Jonathan taking part in this panel, and we don’t think he has a conflict of — Jonathan Zittrain: But I think — John Donvan: — interest in this regard. Do you think this is a relevant question? Jonathan Zittrain: Well, I think it’s relevant in the sense that if somehow that were awkward, and I wouldn’t — even if I were feeling, geez, I wish that is now not the third hit on Google, I’ve got to contend with the fact that it is if it is. If somehow our exchange blew up and went viral, and then I felt bad about it. John Donvan: It’s already happened, apparently. Twitter is going crazy.

  • 00:57:58

    Jonathan Zittrain: What are you going to do? But the idea that the recourse should take the form of going in front of Google and — in fact, I don't know, calling a few friends at Google and saying, hey, just so you know, I shot a request over, I'd really like you guys to give it its fullest consideration — there's no accountability. And Google wants to have more accountability and is being told it cannot, because it cannot disclose its internal processes.

    John Donvan: Well —

    Jonathan Zittrain: So, I worry about Google's market concentration. I feel I've been making that argument today. And, oddly, this solution cements that.

    John Donvan: Jonathan, that is what's called "hitting a curve ball out of the park."

    Jonathan Zittrain: Thank you.

    John Donvan: Well done. I want to remind you we're in the question and answer section of this Intelligence Squared US debate. I'm John Donvan, your moderator. We have two teams of debaters, two against two, arguing for and against this motion: The US should adopt the "right to be forgotten" online. Let's go back to questions. Right down in front, sir. And if you can wait for the mic and stand.

  • 00:58:57

    And I wanted to ask everybody to please mention their name when they — second row here. Thanks.

    Male Speaker: Hi.

    John Donvan: If you can stand up and grab the mic.

    Male Speaker: Hi. I'm Adam. This law seems to be very specifically tailored to Google. And I'm wondering what sort of criteria is applied in Europe about what constitutes a data manager or a search engine. For example, if somebody posted a story about me on Digg that I didn't like, and it rocketed up to the front page — well, digg.com has an extensive search feature. Would that qualify as somebody from whom I could then demand that my data be removed, or not?

    John Donvan: Paul Nemitz, in Europe.

    Paul Nemitz: This judgment is specific to Google as a search engine. Now — and it distinguishes the law which applies to the search engine from the law which applies to the press — to the newspaper, which was also part of the case. So, for the time being, I would say it only applies to the search engine. But when we get into the area of, you know, journalism, free speech — no.

  • 01:00:00

    John Donvan: Andrew, CEO of Digg, how do you feel about that?

    Andrew McLaughlin: I mean, I assume that I would be subject to it if I went over there. And I believe that Bing and Yahoo! are now responding to these requests. And I believe it's the view of the European Commission that all search functions, all search features, all search engines are subject to the "right to be forgotten."

    Paul Nemitz: All search engines.

    Andrew McLaughlin: Paul frames this as just a very "nothing to see here" kind of incremental extension of past principles. But it is a radical departure from prior precedents in the sense that, as you keep saying, the underlying story is not deleted. The past rights to delete were all about the underlying story. If my social security number were published in a newspaper, I would have the right to go and get that deleted from the newspaper. What we're talking about here is this very weird, Google-specific, cabined-to-particular-searches instantiation of a right to force other people to delete information, which is why I think it's such a bad idea.

    Paul Nemitz: It's not Google-specific; it's just that the case was about Google.

  • 01:00:57

    But, of course, there's something new with search engines in comparison to a normal press website. And that is that the search engine, by profiling the individual — when you put in a name, bringing up all the information existing about this individual — goes far deeper into your privacy and profiles you much more than only the one website on which you have your stuff.

    John Donvan: Sir. Can you tell us your name, please?

    Male Speaker: Sure. It's Alameen Sumar [spelled phonetically]. So my question for the "no" side is this: Are there ways — mechanisms — to temper the things we see as problematic about the "right to be forgotten," in ways that make the case more compelling? So, for example, you know, imagine this happened in the US: Congress passed a statute and said something like, if something's irrelevant or inadequate, it has to be removed. But then the agency to which that power was delegated would promulgate some regulations. And the regulations would be a bit more specific.

    John Donvan: Good question.

    Male Speaker: And then if, you know, Google wanted to, you know, deny a request or, you know, didn't want to deal with the process, you could have some special advocate appointed to —

    John Donvan: Okay.

  • 01:02:01

    So basically, is there an apparatus that would make a "right to be forgotten" workable in the US to the point where you would find it acceptable?

    Jonathan Zittrain: It is curious you're asking our side rather than theirs. It points out that the pro side should come up with something that actually can work and address some of the problems that really are latent in the European model. Could we come up with something? Quite possibly. I'd welcome it. Again, I understand that we have a real problem here in privacy. And I think — and I'll use some of my closing remarks to try to offer some of my own suggestions — I do worry about a litigation-style model as a solution for the kinds of reputational problems we're talking about. I think it's a little bit of the wrong kind of peg in the wrong kind of hole.

    Andrew McLaughlin: I'm not with Jonathan on this one. I can't think of any administrative construction — any administrative structure — that would make this acceptable, because the fundamental right that we are talking about here is a duty of other people to delete and forget true past information about you.

  • 01:03:02

    I don't think there's any way that's constitutional, for one thing, but I also think it's just fundamentally a terrible, terrible idea.

    John Donvan: Eric, did you want to respond to that?

    Eric Posner: It's not unconstitutional. The right of privacy already does this. It already has the effect of deterring newspapers and other people from publishing true facts about you. Expungement laws are statutes in most states that allow people's criminal records to be erased — you know, the facts are gone. All of these statutes and legal norms have standards and criteria. This is something that we can debate about, how strict the criteria would be, and I suspect Jonathan and I, you know, are very close together — meaning that he's on our side.

    [laughter]

    Eric Posner: For example, I would completely rule out public figures from using the "right to be forgotten" — only private individuals, and only private individuals who didn't commit a crime, that sort of thing.

  • 01:04:02

    So, I do think we can — that there are administrative criteria. Now, often it's very difficult to set these up in advance, precisely because we're in a new world with new technology. And so you have to build it up through decisions. That's how our legal system works. And I expect that's how it would work if the "right to be forgotten" were implemented.

    John Donvan: [unintelligible] take right down in the front in the third row, please. And the mic's coming down your right-hand side. And if you could stand up —

    Female Speaker: [unintelligible] as a follow-up question to yours. In order for your side to win —

    John Donvan: Just so that the radio audience knows who you're talking about — the side arguing for the motion — they're arguing for the "right to be forgotten."

    Female Speaker: Right. So to me it's based on an inherent right to privacy, but how do you square that with, at least in the U.S. — going to your constitutional point — the Fourth Amendment search-and-seizure context, where a reasonable expectation of privacy is something you don't have the second you share information with anyone, whether it be a conversation with someone or putting it on Facebook? How do you square those?

  • 01:05:10

    John Donvan: Eric Posner.

    Eric Posner: Well, those are different contexts. The Fourth Amendment is in the context of criminal investigations. And, yeah, there's a Supreme Court opinion which says what you said, but I suspect it will be revised over time. These constitutional norms are not fixed; they are basically made up by the Supreme Court over time. And the Supreme Court has already signaled that it's going to reconsider many of these constitutional norms in light of new technology. There's a case involving GPS, for example, where the courts understand that their old norms don't work in a world in which the police can tail people with GPS rather than physically tailing them with cars. So, what the Supreme Court will need to do, going forward, is balance interests in free speech with interests in privacy.

  • 01:05:58

    And in doing that, they'll come up with new norms that probably will respect both interests, the best that can be done.

    John Donvan: I'm going to — unless you really need to respond to that question, I'm going to move on.

    Female Speaker: For Andrew and Jonathan, I'm thinking of a possibility that might appeal to you. In the past there was more effort and less anonymity. Someone would have to go to the local courthouse, go to the library to find information. They risked passing people, asking someone for help to find a record. Now there's less effort and more anonymity to just click on something. What if, in the case of the Telegraph, they had a list of suppressed articles, but to access those suppressed articles you had to log in in some fashion, or they put up a little sign that said, "You are being traced," just to access the suppressed articles — because suppressed is not the same as deleted —

    John Donvan: Okay, I think we see where you're going with that. Let's take a — let Jonathan answer that.

    Jonathan Zittrain: I —

    John Donvan: Thank you for the question.

  • 01:06:56

    Jonathan Zittrain: I haven't bothered to Bing it yet, but it's either Rowan v. Postmaster General or Lamont v. Postmaster General — a wonderful case in which the U.S. Congress passed a law that said, "For anybody interested in receiving communist literature" — I'm not making this up — "you have to register at your local post office. Otherwise, for your own convenience, the post office will trash any communist mail otherwise destined for you." The Supreme Court then, and I think the Supreme Court now, found and would find that unconstitutional, because it's exactly the kind of surveillance on who's wanting to peek under the envelope that all sides appear to be concerned about.

    John Donvan: Another question? Right in the center there. Adamant waving actually does work with me.

    [laughter]

    John Donvan: When I see everybody going nuts. Can you tell us your name, please? Again —

    [talking simultaneously]

    John Donvan: Yeah, thanks.

    Male Speaker: This question is for the "for" side. I'm a developer.

  • 01:08:00

    It would be relatively trivial for me to create a search engine of my own to crawl major news sources, which, you know — in the EU it sounds like it'd give me an advantage over other people in terms of the information that I have access to. I'm curious how the law would apply to that, and if it would apply to that, how it would apply in the United States to other private search engines, such as the NSA's database of information that they have about us.

    John Donvan: Okay. So you're asking, if you were smart enough — which it sounds like you probably are —

    [laughter]

    John Donvan: — to create your own Google at home that would search the web for stuff, does that become relevant in this conversation? I know the law in Europe doesn't address it, but let's say, does it become relevant in this conversation? Would somebody like you need to become subject to laws like that? Paul or Eric.

    Paul Nemitz: Well, first of all, a data controller, as we call it, is not someone who collects information for private use.

  • 01:08:59

    So, as long as you do this for private use, you would not, in Europe, fall under our data protection law — but of course, to Google, it applies. And indeed, for your question on the NSA, the data protection law applies without difference to the state and to private search engines. So, the right to deletion and the right to ask "what do you have about me" in Europe apply to the state and also to corporations. But of course, within reason — national security limitations and so on — you know, there we are not so far apart.

    John Donvan: So, how far are you getting in asking the NSA to delete information?

    [laughter]

    Paul Nemitz: Well, the answer is, he described a case where he would, you know, as a developer, for his private purposes, design his own web engine. So this constellation would mean that you are not a controller; you would not be treated like Google, and therefore the right to be forgotten would not apply in relation to you, because you're doing this as an individual. So, you would have to delete nothing.

  • 01:09:56

    John Donvan: Sir in the back near the wall. Thanks.

    Male Speaker: Hi. My name is Jeff Roberts. For the "yes" side — Paul, you wanted to reassure us by pointing to existing EU press law, that, you know, there's nothing new here, everything's legal. But as a North American journalist, European press law scares the hell out of me, particularly U.K. libel law, which has been used by the elite to chill journalists and average people. So perhaps, Eric Posner, you'd like to address that.

    John Donvan: Can you rephrase what your question specifically is, in terms of the motion here?

    Male Speaker: In terms of the motion, do we have to fear censorship as a result of this new law coming into effect? Is it possible that the example of European libel law is perhaps, you know, an alarming signal against imposing something like this?

    John Donvan: In a sense, are you saying that the climate for suppression of information in the name of protection of privacy is greater in the U.K. and in Europe than it is here, and should that be something that is relevant here and that we should worry about?

  • 01:11:07

    Am I getting —

    Male Speaker: Not exactly. I'm saying another way of press suppression —

    John Donvan: Oh, in that case I'm going to respectfully pass on the question, because I want to keep it specifically on the right that we're talking about. Sir in the back in the red sweater, the very far back. Thanks for your question, though.

    Male Speaker: Tim Havoland. Imagine us 30 years from now, and the two college students at Oklahoma University are now 50 years old. When their kid searches them, should it be the first thing they find, this racist chant they did, or should they not find it at all?

    [applause]

    Jonathan Zittrain: I wouldn't want Google to be uniquely privileged to answer that question.

    John Donvan: Does anyone want to respond to that? Eric Posner.

  • 01:11:56

    Eric Posner: I think that would need to be public, because it was a huge public event in this country. I do think that if some kids put a racist statement on Facebook and 30 years from now that came up, I would be troubled by that. And the reason for being troubled by that is that people develop over time. You know, the one thing that you do as a 17-year-old or 16-year-old doesn't define you for your life. Even if you did something a lot more serious — committed a crime, for example — the problem is that when you're 50, the person who Googles you may well know nothing about what you've done over the last 33 years. And they get this hit, and although one would wish that people would take into account the passage of time, I think it's human nature to jump to conclusions and assume that this 50-year-old person is the same as the 17-year-old.

    John Donvan: From a statistic I read while preparing for this, I understand that in Europe, 89 percent of all Google users never go to the second page.

  • 01:13:00

    They're only looking at what comes up on the first page. Andrew McLaughlin.

    Andrew McLaughlin: Well, so I feel this very deeply. The notion that one terrible thing that you do, or that is associated with you, trails you your entire life I think is heartbreaking. And it's something that concerns me deeply, and I think a lot about that. You know, the internet has leapt way ahead of the human capacity to just kind of situate incidents like that in a broader context of somebody's life. And so, you know, my view on this is, still, censorship — which I don't mean to be kind of a nuclear bomb that ends this argument, but to position this as what it is, which is a right to force other people to delete things that they would otherwise rather not delete — is not the right answer. We need to have the internet evolve. I'd like to see Google evolve, to add a more direct right to respond. And by the way, footnote: they've tried.

  • 01:13:56

    They created something called Google Plus, where anybody could create a profile that shows up very high in your results. You can populate that with any data you want about yourself. And I think, broadly, though, as a society, we have to kind of learn how not to judge people in this incredibly harsh way that we seem kind of inclined to do.

    John Donvan: Paul Nemitz, may I ask you a question in terms of how the thing functions? In giving the deciding power, the adjudicating power, to Google — a private corporation — to make the decision, yes or no, we're going to delete or not, what are the checks on that from outside? How does that work?

    Paul Nemitz: Well, Google as a private corporation is subject to the law like anybody else. And, of course, its decisions can be checked. So, if you ask for something to be taken down, and they don't take it down, then you have, in Europe, the possibility either to go to a court directly — and then a judge will decide — or you can go to an independent data protection authority, which is something like an independent agency like the FTC, and they will order Google, if they think the request is justified, to take it down.

  • 01:15:01

    But then Google has the possibility, as in Europe and in America, to go to court against the decision of the FTC.

    Male Speaker: But Paul —

    Paul Nemitz: This is an area which is subject to the law. It's not a discretionary decision of Google where, you know, there's the blue sky and beyond that there's nothing. Google is here in the same position as any other enterprise. They operate in the realm of law, and in Europe, as in the United States, in the end, the law is applied by judges. And what Google does, by the way — and I think that is good — is also very much scrutinized by the fourth power. It is good that the press is after Google and says, be more transparent, tell us what you're doing, tell us what the criteria are. That is good. And we totally support it. I mean, of course, you know —

    Male Speaker: Yeah, Paul, I have to — [unintelligible].

    John Donvan: [unintelligible].

    Andrew McLaughlin: I think you're evading the central objection of our side to what you're saying, and it is this: All of the pressures that you just outlined are towards deletion.

  • 01:15:59

    So if the person requesting it is turned down by Google, they can appeal. They can get yet another higher authority to force the deletion. There is, nevertheless, in many of these stories — I mean, on and on through this list that I've got here — there are crimes where there were victims. Some of those victims might want all evidence of that crime to be disappeared. Fine. But many of the victims may want those stories to live on. They may want the memory of their loved ones, the story of that crime, the details of the criminal's conduct to be remembered.

    John Donvan: But it does live on. It's not removed from the internet. It's just that you can't find it through the criminal's name.

    Andrew McLaughlin: They may feel that the perpetrator should be associated with that crime. And it may be important to them as a matter of justice — maybe not for all the reasons that you might agree with or I might agree with, but it matters to them. My point is, there is a broader interest in many of these stories. Or take the one about the referee. The fans of the team that was wronged by the false penalty call may want that coach — sorry, referee's — name to be associated with that.

  • 01:17:01

    They have an interest in that.

    Eric Posner: There are interests on both sides.

    John Donvan: Eric Posner.

    Eric Posner: So, that's the classic role for the law, to balance the interests on both sides —

    Andrew McLaughlin: How are they vindicated?

    Eric Posner: And they are vindicated. And what's on the side of disclosure is something far more powerful than the government: it's the profit motive. Google makes money — the better its searches are, the more money it makes. This is why —

    [talking simultaneously]

    Jonathan Zittrain: — talking about. Any given case on the margin, for some obscure search in the massive river of search taking place —

    Eric Posner: No, but they don't want to set a precedent where they —

    Jonathan Zittrain: Google will pay no penalty for over-deleting.

    Eric Posner: They will.

    Jonathan Zittrain: And no one will even know.

    Eric Posner: They will.

    Jonathan Zittrain: Who will even know?

    Eric Posner: They will know, because it will do worse than an alternative search engine in a world in which we have search engines competing with each other.

    John Donvan: And that concludes round two of this Intelligence Squared US debate, where our motion is: The US should adopt the "right to be forgotten" online.

  • 01:17:58

    [applause]

    John Donvan: Now we move on to round three. Round three will be closing statements by each debater in turn. They will be two minutes each. And here to speak first in the closing round, Paul Nemitz. He is a director at the EU Commission, Directorate-General for Justice and Consumers.

    Paul Nemitz: Ladies and gentlemen, I am defending the case that you should vote in favor of the motion that the US should adopt a "right to be forgotten" — not necessarily the one of the EU, but a "right to be forgotten." Why should you vote for this motion? Because technology moves on. More and more information will be collected about you by the state and by private parties. And your life will be plotted by them. Your life will be profiled. You will be predicted, and you will be manipulated, unless you have a tool in law to control your own destiny.

  • 01:19:01

    And in a free society, this right is the right to informational self-determination, and it means, for Google searches, that you can ask for de-linking, provided there are no countervailing interests. And, of course, the most important countervailing interest is democracy — free speech, the information which is relevant for our democracy to function. In the end, what we are discussing here is the power relation between you as an individual, as a free citizen of the United States, and, on the other hand, the power of Google. Is Google free to do anything in terms of plotting you, predicting you, and manipulating you? Or do you, as an individual, have standing and rights to defend your freedom? That's what you're voting about tonight. If you believe individuals should have their destiny in their own hands, also in times when technology makes total prediction, total collection of anything you do, possible, then tonight you should vote that the US should adopt the "right to be forgotten" online.

  • 01:20:06

    Thank you very much.

    John Donvan: Thank you, Paul Nemitz.

    [applause]

    John Donvan: And that's our motion, the US should adopt a "right to be forgotten" online. And here to make his closing argument against the motion, Andrew McLaughlin. He is CEO of Digg and Instapaper and a partner at Betaworks.

    Andrew McLaughlin: You should vote against the motion. Paul's construction that he just put forward is, in my judgment, a false construction: the empowerment of an individual on one side of a conversation — which is what speech on the internet is. It's people speaking, other people reading, and other people choosing to remember. What he is trying to push is a right to force others to forget true information that they may wish to remember. It's that simple. It is an extreme solution to a very real problem.

  • 01:20:53

    It's the wrong solution to a very real problem, particularly where there are other paths — more speech, a right to respond, new tools, the evolution of the social understanding of past mistakes — to deal with embarrassing mistakes that come out on the internet. I want to say, in response to one other point that Eric raised: simply calling this what it is, which is censorship, is indeed not the end of the debate. We censor copyright infringement, we censor sexual abuse images of children, we censor a lot of different kinds of information. But each is justified and overcomes a very high bar in order to stand as an exception to the right of free speech. This case, the affirmative case, has not met its burden to show that the right to force other people to delete embarrassing facts from your past rises to that same level. It does not. And for that reason, you should vote against the motion. The final image that I want to leave you with is one that sticks in my mind whenever I think about this.

  • 01:21:56

    And that is the ever-evolving photographs of the Soviet Politburo through the 1930s and 1940s. It's captured also in fiction, in 1984. But you can see photographs where Stalin would fall out with one of his Politburo members and, sure enough, the next time they published the yearbook or the history book or the textbook, the new edition would have him completely etched out. We are talking about forgetting history, forgetting the past. The right to remember is as important as the right to speak. And for that reason, you should oppose this motion.

    John Donvan: Thank you, Andrew McLaughlin.

    [applause]

    John Donvan: And the motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position supporting this motion, Eric Posner. He's a professor of law at the University of Chicago.

    Eric Posner: This debate is really not about politics. It's not about erasing images and statements that displease the government. The debate is about privacy. Privacy is a right in Europe. It's a right in the United States. It's been a right in the United States for a very long time.

  • 01:22:54

    It used to be controlled by tort law, an area of the law, and to some extent statutory law and other types of law. And I just want to ask you to imagine again — but this time that it's you — that perhaps you're going through hard times in your marriage, if you are married, and perhaps you get involved in a divorce. And divorces always work out a lot worse than people want. People exaggerate. They make allegations. So, imagine it's you, and your spouse says, falsely maybe, that you neglected the children — or maybe a statement that you made about your own children that you wouldn't want them to hear is repeated in the court documents. In the old days, nothing much would come from this. Today — imagine it now — this is what people learn about you; this is the first thing that people learn about you when they Google your name. So you go out and you talk to a potential romantic partner or a boss or a neighbor, and this is the first thing they're going to hear about you, and they're going to act on it.

  • 01:24:05

    Now, Andrew quoted Orwell — I believe it was, "He who controls the past controls the future." Orwell's invocation was inevitable. But I want to ask you, though: who is the "he" in that statement? Is the "he" you, is the "he" me, or is the "he" Google? Thank you very much.

    John Donvan: Thank you, Eric Posner.

    [applause]

    John Donvan: The motion is "The U.S. Should Adopt the 'Right to Be Forgotten' Online." And here to summarize his position against this motion, Jonathan Zittrain. He is cofounder of the Berkman Center for Internet & Society.

    Jonathan Zittrain: I can't believe in the last slot I'm still the first to be able to thank everybody — Paul, Eric, John, Andrew, and all of you — for an unforgettable evening.

    [laughter]

    Jonathan Zittrain: This has been wonderful. Sorry. We'll strike that later. We've heard about a lot of nightmare stories.

  • 01:25:03

    Everything we do is getting immediately streamed onto the web. Let's hope we don't get to that place. And I will happily join the efforts of academics, of corporate folk, and of governments to have that reality not come about. But notice how what the four folks were saying over the course of the evening — it's shifting. It's a standard that's — "Well, we can tweak that. We'll make sure that only the good stuff goes out and only the bad stuff goes out." But then I come to the question Eric just asked: who is "we"? And in the proposal, "we" is Google — not just being ordered to do something, but being the decider, reviewable only if the complainant is not happy with the result. That is basically built — wired — into the proposal. And I didn't hear from the other side answers to questions like, "When Google makes those decisions, is the public allowed to know?

  • 01:26:01

    Are the websites themselves that are affected allowed to know?" Paul has been working, I think, to make sure — not on a specific case, only on the overall criteria — and that seems very, very dangerous to me. Now, John said we didn't offer solutions. And in 30 seconds, it's true that we can't unpack them very well. But I'd encourage people to look at solutions very specific to very sensitive systems of records, like legal documents and court materials; at options for contextualization, to include possibly even somebody like Google deciding, when it's a search on a name, that maybe the first result shouldn't be those 10 terrible whatever-comes-out-of-the-roulette-wheel links, but a curated page marked as such — and then many won't go to the second page; reputation systems; and, of course, competition —

    John Donvan: Jonathan Zittrain, I'm sorry —

    Jonathan Zittrain: — can I end with one thing —

    John Donvan: — your time is up. Thank you very much, just in fairness to everybody. Thank you, Jonathan.

    [laughter]

    [applause]

    And that concludes our closing statements.

  • 01:27:04

    And now it's time to learn which side you feel has argued the best. We're going to ask you again to go to the keypads at your seat and, again, look at this motion: "The U.S. Should Adopt the 'Right to Be Forgotten' Online." If you have been persuaded that this motion has been well argued, push number one. If you are on the opposite side, push number two. And if you're undecided, push number three. Then we'll have the results in just a moment. Before we do that, I want to say that this was a little bit of a tough debate in the sense that it was relatively nuanced, and I really want to congratulate, in a couple of ways, the teams for helping bring this to all of us and making it clear. Paul Nemitz, for devoting a lot of your opening statement to actually explaining so clearly what we're talking about — you just did us all a service. It was a very teachable moment for all of us. So, thank you for doing that.

  • 01:28:02

    [applause]

    John Donvan: And for the side arguing against the motion — although they were not obliged to tell us what their better plan was or what else could work, it was actually useful to all of us to hear that. So, I want to thank you for that. And Jonathan, what is it you were going to say before I cut you off? Now that the voting is done.

    [talking simultaneously]

    Jonathan Zittrain: Here was the landing I was hoping to make. We all agree that people are not as bad as their own worst deed and should not be judged by that. By the same token, institutions are not as good as their own press releases, and we can't trust them to be so.

    [applause]

    John Donvan: Well, I'm sorry the clock is my master and has to be your master — the people voted before hearing that — but thanks very much. I just wanted to give you your moment.

  • 01:28:54

    I also want to tell everybody who got up and asked a question, even the ones that I had to pass on: it takes a lot of guts to do, and the questions were actually superb tonight, so thank you, everybody who got up and asked a question. I also want to thank our generous donors and supporters, some of whom are here tonight. For those of you who don’t know, we are a non-profit organization, and the ticket sales don’t come close to covering the cost of mounting a great debate like this. So, I’d encourage you, if you would be willing to make a donation, to go to our website, that’s IQ2US.org, and make a donation there. Our next debate is later this month. It’s at Columbia University at the Miller Theatre. We’re partnering with the Richard Paul Richman Center for Business, Law, and Public Policy at Columbia, as well as with the National Constitution Center. We’re going to be continuing our ongoing debates on constitutional issues that we started last year in Philadelphia. The topic at Columbia will cover whether the president has exceeded his constitutional authority to wage war.

  • 01:29:55

    And in April we’re going to be back here on the 15th, in this theater, and our motion will be "Abolish the Death Penalty." We hope to see a lot of you at both of those debates. And I want to get to the results now, because it is all in. We have the final results. Once again, the motion was "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online." We’ve heard the arguments — sorry? Did I — sorry? There’s chatter that I thought was a response to some sort of mistake I made. Oh, it came up already?

    Male Speaker: It was erased.

    [laughter]

    Male Speaker: Does anybody remember?

    [applause]

    John Donvan: All right. Well, for the sake of people who are listening — [laughter] Play along, please. [laughter] Our motion is this: "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online."

  • 01:30:56

    Remember, the team whose numbers have moved the most between the two votes is the team that will be declared our winner. Let’s look at the first vote on "The U.S. Should Adopt the ‘Right to Be Forgotten’ Online": 36 percent agreed, 26 percent were against, and 38 percent were undecided. In the second vote, the team arguing for the motion received 35 percent; that means they lost one percentage point. The team arguing against the motion went from 26 percent in the first vote to 56 percent in the second; they went up 30 percentage points.

    [applause]

    John Donvan: The team arguing against this motion of a right to be forgotten online has won this Intelligence Squared U.S. debate. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time.

    [applause]

